CHECK: Is CUDA the right version (10)? Not applying augmentation to RGBD data Loading a pascal format RGBD dataset WARN: Loading imagenet weights Creating model, this may take a second... Building ResNet backbone using defined input shape of Tensor("input_1:0", shape=(?, ?, ?, 4), dtype=float32) Loading weights into RGB model Loading weights into depth model tracking anchors tracking anchors tracking anchors tracking anchors tracking anchors Model: "retinanet" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) (None, None, None, 4 0 __________________________________________________________________________________________________ lambda_1 (Lambda) (None, None, None, 3 0 input_1[0][0] __________________________________________________________________________________________________ padding_conv1_rgb (ZeroPadding2 (None, None, None, 3 0 lambda_1[0][0] __________________________________________________________________________________________________ conv1_rgb (Conv2D) (None, None, None, 6 9408 padding_conv1_rgb[0][0] __________________________________________________________________________________________________ bn_conv1_rgb (BatchNormalizatio (None, None, None, 6 256 conv1_rgb[0][0] __________________________________________________________________________________________________ conv1_relu_rgb (Activation) (None, None, None, 6 0 bn_conv1_rgb[0][0] __________________________________________________________________________________________________ pool1_rgb (MaxPooling2D) (None, None, None, 6 0 conv1_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2a_rgb (Conv2D) (None, None, None, 6 4096 pool1_rgb[0][0] 
__________________________________________________________________________________________________ bn2a_branch2a_rgb (BatchNormali (None, None, None, 6 256 res2a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2a_relu_rgb (Activa (None, None, None, 6 0 bn2a_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding2a_branch2b_rgb (ZeroPad (None, None, None, 6 0 res2a_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_rgb (Conv2D) (None, None, None, 6 36864 padding2a_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn2a_branch2b_rgb (BatchNormali (None, None, None, 6 256 res2a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_relu_rgb (Activa (None, None, None, 6 0 bn2a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2c_rgb (Conv2D) (None, None, None, 2 16384 res2a_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_branch1_rgb (Conv2D) (None, None, None, 2 16384 pool1_rgb[0][0] __________________________________________________________________________________________________ bn2a_branch2c_rgb (BatchNormali (None, None, None, 2 1024 res2a_branch2c_rgb[0][0] __________________________________________________________________________________________________ bn2a_branch1_rgb (BatchNormaliz (None, None, None, 2 1024 res2a_branch1_rgb[0][0] __________________________________________________________________________________________________ res2a_rgb (Add) (None, None, None, 2 0 bn2a_branch2c_rgb[0][0] bn2a_branch1_rgb[0][0] 
__________________________________________________________________________________________________ res2a_relu_rgb (Activation) (None, None, None, 2 0 res2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_rgb (Conv2D) (None, None, None, 6 16384 res2a_relu_rgb[0][0] __________________________________________________________________________________________________ bn2b_branch2a_rgb (BatchNormali (None, None, None, 6 256 res2b_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_relu_rgb (Activa (None, None, None, 6 0 bn2b_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding2b_branch2b_rgb (ZeroPad (None, None, None, 6 0 res2b_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_rgb (Conv2D) (None, None, None, 6 36864 padding2b_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn2b_branch2b_rgb (BatchNormali (None, None, None, 6 256 res2b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_relu_rgb (Activa (None, None, None, 6 0 bn2b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2c_rgb (Conv2D) (None, None, None, 2 16384 res2b_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ bn2b_branch2c_rgb (BatchNormali (None, None, None, 2 1024 res2b_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2b_rgb (Add) (None, None, None, 2 0 bn2b_branch2c_rgb[0][0] res2a_relu_rgb[0][0] 
__________________________________________________________________________________________________ res2b_relu_rgb (Activation) (None, None, None, 2 0 res2b_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2a_rgb (Conv2D) (None, None, None, 6 16384 res2b_relu_rgb[0][0] __________________________________________________________________________________________________ bn2c_branch2a_rgb (BatchNormali (None, None, None, 6 256 res2c_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2a_relu_rgb (Activa (None, None, None, 6 0 bn2c_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding2c_branch2b_rgb (ZeroPad (None, None, None, 6 0 res2c_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_rgb (Conv2D) (None, None, None, 6 36864 padding2c_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn2c_branch2b_rgb (BatchNormali (None, None, None, 6 256 res2c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_relu_rgb (Activa (None, None, None, 6 0 bn2c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2c_rgb (Conv2D) (None, None, None, 2 16384 res2c_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ bn2c_branch2c_rgb (BatchNormali (None, None, None, 2 1024 res2c_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2c_rgb (Add) (None, None, None, 2 0 bn2c_branch2c_rgb[0][0] res2b_relu_rgb[0][0] 
__________________________________________________________________________________________________ res2c_relu_rgb (Activation) (None, None, None, 2 0 res2c_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_rgb (Conv2D) (None, None, None, 1 32768 res2c_relu_rgb[0][0] __________________________________________________________________________________________________ bn3a_branch2a_rgb (BatchNormali (None, None, None, 1 512 res3a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_relu_rgb (Activa (None, None, None, 1 0 bn3a_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding3a_branch2b_rgb (ZeroPad (None, None, None, 1 0 res3a_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2b_rgb (Conv2D) (None, None, None, 1 147456 padding3a_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn3a_branch2b_rgb (BatchNormali (None, None, None, 1 512 res3a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2b_relu_rgb (Activa (None, None, None, 1 0 bn3a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2c_rgb (Conv2D) (None, None, None, 5 65536 res3a_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch1_rgb (Conv2D) (None, None, None, 5 131072 res2c_relu_rgb[0][0] __________________________________________________________________________________________________ bn3a_branch2c_rgb (BatchNormali (None, None, None, 5 2048 res3a_branch2c_rgb[0][0] 
__________________________________________________________________________________________________ bn3a_branch1_rgb (BatchNormaliz (None, None, None, 5 2048 res3a_branch1_rgb[0][0] __________________________________________________________________________________________________ res3a_rgb (Add) (None, None, None, 5 0 bn3a_branch2c_rgb[0][0] bn3a_branch1_rgb[0][0] __________________________________________________________________________________________________ res3a_relu_rgb (Activation) (None, None, None, 5 0 res3a_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2a_rgb (Conv2D) (None, None, None, 1 65536 res3a_relu_rgb[0][0] __________________________________________________________________________________________________ bn3b1_branch2a_rgb (BatchNormal (None, None, None, 1 512 res3b1_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2a_relu_rgb (Activ (None, None, None, 1 0 bn3b1_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding3b1_branch2b_rgb (ZeroPa (None, None, None, 1 0 res3b1_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2b_rgb (Conv2D) (None, None, None, 1 147456 padding3b1_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn3b1_branch2b_rgb (BatchNormal (None, None, None, 1 512 res3b1_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2b_relu_rgb (Activ (None, None, None, 1 0 bn3b1_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2c_rgb (Conv2D) (None, None, None, 5 65536 res3b1_branch2b_relu_rgb[0][0] 
__________________________________________________________________________________________________ bn3b1_branch2c_rgb (BatchNormal (None, None, None, 5 2048 res3b1_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b1_rgb (Add) (None, None, None, 5 0 bn3b1_branch2c_rgb[0][0] res3a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b1_relu_rgb (Activation) (None, None, None, 5 0 res3b1_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2a_rgb (Conv2D) (None, None, None, 1 65536 res3b1_relu_rgb[0][0] __________________________________________________________________________________________________ bn3b2_branch2a_rgb (BatchNormal (None, None, None, 1 512 res3b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2a_relu_rgb (Activ (None, None, None, 1 0 bn3b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding3b2_branch2b_rgb (ZeroPa (None, None, None, 1 0 res3b2_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2b_rgb (Conv2D) (None, None, None, 1 147456 padding3b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn3b2_branch2b_rgb (BatchNormal (None, None, None, 1 512 res3b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2b_relu_rgb (Activ (None, None, None, 1 0 bn3b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2c_rgb (Conv2D) (None, None, None, 5 65536 res3b2_branch2b_relu_rgb[0][0] 
__________________________________________________________________________________________________ bn3b2_branch2c_rgb (BatchNormal (None, None, None, 5 2048 res3b2_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b2_rgb (Add) (None, None, None, 5 0 bn3b2_branch2c_rgb[0][0] res3b1_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_relu_rgb (Activation) (None, None, None, 5 0 res3b2_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2a_rgb (Conv2D) (None, None, None, 1 65536 res3b2_relu_rgb[0][0] __________________________________________________________________________________________________ bn3b3_branch2a_rgb (BatchNormal (None, None, None, 1 512 res3b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2a_relu_rgb (Activ (None, None, None, 1 0 bn3b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding3b3_branch2b_rgb (ZeroPa (None, None, None, 1 0 res3b3_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_rgb (Conv2D) (None, None, None, 1 147456 padding3b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn3b3_branch2b_rgb (BatchNormal (None, None, None, 1 512 res3b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_relu_rgb (Activ (None, None, None, 1 0 bn3b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2c_rgb (Conv2D) (None, None, None, 5 65536 res3b3_branch2b_relu_rgb[0][0] 
__________________________________________________________________________________________________ bn3b3_branch2c_rgb (BatchNormal (None, None, None, 5 2048 res3b3_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b3_rgb (Add) (None, None, None, 5 0 bn3b3_branch2c_rgb[0][0] res3b2_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_relu_rgb (Activation) (None, None, None, 5 0 res3b3_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_rgb (Conv2D) (None, None, None, 2 131072 res3b3_relu_rgb[0][0] __________________________________________________________________________________________________ bn4a_branch2a_rgb (BatchNormali (None, None, None, 2 1024 res4a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_relu_rgb (Activa (None, None, None, 2 0 bn4a_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding4a_branch2b_rgb (ZeroPad (None, None, None, 2 0 res4a_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4a_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn4a_branch2b_rgb (BatchNormali (None, None, None, 2 1024 res4a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2b_relu_rgb (Activa (None, None, None, 2 0 bn4a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4a_branch2b_relu_rgb[0][0] 
__________________________________________________________________________________________________ res4a_branch1_rgb (Conv2D) (None, None, None, 1 524288 res3b3_relu_rgb[0][0] __________________________________________________________________________________________________ bn4a_branch2c_rgb (BatchNormali (None, None, None, 1 4096 res4a_branch2c_rgb[0][0] __________________________________________________________________________________________________ bn4a_branch1_rgb (BatchNormaliz (None, None, None, 1 4096 res4a_branch1_rgb[0][0] __________________________________________________________________________________________________ res4a_rgb (Add) (None, None, None, 1 0 bn4a_branch2c_rgb[0][0] bn4a_branch1_rgb[0][0] __________________________________________________________________________________________________ res4a_relu_rgb (Activation) (None, None, None, 1 0 res4a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4a_relu_rgb[0][0] __________________________________________________________________________________________________ bn4b1_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b1_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b1_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding4b1_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b1_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b1_branch2b_rgb[0][0] __________________________________________________________________________________________________ bn4b1_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b1_branch2b_rgb[0][0] 
__________________________________________________________________________________________________ res4b1_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b1_branch2b_rgb[0][0] __________________________________________________________________________________________________ lambda_2 (Lambda) (None, None, None, 1 0 input_1[0][0] __________________________________________________________________________________________________ res4b1_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b1_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding_conv1_d (ZeroPadding2D) (None, None, None, 1 0 lambda_2[0][0] __________________________________________________________________________________________________ bn4b1_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b1_branch2c_rgb[0][0] __________________________________________________________________________________________________ conv1_d (Conv2D) (None, None, None, 6 3136 padding_conv1_d[0][0] __________________________________________________________________________________________________ res4b1_rgb (Add) (None, None, None, 1 0 bn4b1_branch2c_rgb[0][0] res4a_relu_rgb[0][0] __________________________________________________________________________________________________ conv1_relu_d (Activation) (None, None, None, 6 0 conv1_d[0][0] __________________________________________________________________________________________________ res4b1_relu_rgb (Activation) (None, None, None, 1 0 res4b1_rgb[0][0] __________________________________________________________________________________________________ pool1_d (MaxPooling2D) (None, None, None, 6 0 conv1_relu_d[0][0] __________________________________________________________________________________________________ res4b2_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b1_relu_rgb[0][0] __________________________________________________________________________________________________ 
res2a_branch2a_d (Conv2D) (None, None, None, 6 4096 pool1_d[0][0] __________________________________________________________________________________________________ bn4b2_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2a_relu_d (Activati (None, None, None, 6 0 res2a_branch2a_d[0][0] __________________________________________________________________________________________________ res4b2_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding2a_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ padding4b2_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b2_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b2_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_relu_d (Activati (None, None, None, 6 0 res2a_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b2_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2c_d (Conv2D) (None, None, None, 2 16384 res2a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res2a_branch1_d 
(Conv2D) (None, None, None, 2 16384 pool1_d[0][0] __________________________________________________________________________________________________ res4b2_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_d (Add) (None, None, None, 2 0 res2a_branch2c_d[0][0] res2a_branch1_d[0][0] __________________________________________________________________________________________________ res4b2_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b2_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_relu_d (Activation) (None, None, None, 2 0 res2a_d[0][0] __________________________________________________________________________________________________ bn4b2_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b2_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_d (Conv2D) (None, None, None, 6 16384 res2a_relu_d[0][0] __________________________________________________________________________________________________ res4b2_rgb (Add) (None, None, None, 1 0 bn4b2_branch2c_rgb[0][0] res4b1_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_relu_d (Activati (None, None, None, 6 0 res2b_branch2a_d[0][0] __________________________________________________________________________________________________ res4b2_relu_rgb (Activation) (None, None, None, 1 0 res4b2_rgb[0][0] __________________________________________________________________________________________________ padding2b_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2b_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2a_rgb (Conv2D) (None, None, None, 2 262144 
res4b2_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2b_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b3_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_relu_d (Activati (None, None, None, 6 0 res2b_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2c_d (Conv2D) (None, None, None, 2 16384 res2b_branch2b_relu_d[0][0] __________________________________________________________________________________________________ padding4b3_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b3_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_d (Add) (None, None, None, 2 0 res2b_branch2c_d[0][0] res2a_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2b_relu_d (Activation) (None, None, None, 2 0 res2b_d[0][0] __________________________________________________________________________________________________ bn4b3_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2a_d (Conv2D) (None, None, None, 6 16384 res2b_relu_d[0][0] 
__________________________________________________________________________________________________ res4b3_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2a_relu_d (Activati (None, None, None, 6 0 res2c_branch2a_d[0][0] __________________________________________________________________________________________________ res4b3_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b3_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding2c_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2c_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b3_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b3_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2c_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_rgb (Add) (None, None, None, 1 0 bn4b3_branch2c_rgb[0][0] res4b2_relu_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_relu_d (Activati (None, None, None, 6 0 res2c_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_relu_rgb (Activation) (None, None, None, 1 0 res4b3_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2c_d (Conv2D) (None, None, None, 2 16384 res2c_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b4_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b3_relu_rgb[0][0] 
__________________________________________________________________________________________________ res2c_d (Add) (None, None, None, 2 0 res2c_branch2c_d[0][0] res2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b4_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b4_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2c_relu_d (Activation) (None, None, None, 2 0 res2c_d[0][0] __________________________________________________________________________________________________ res4b4_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b4_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_d (Conv2D) (None, None, None, 1 32768 res2c_relu_d[0][0] __________________________________________________________________________________________________ padding4b4_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b4_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_relu_d (Activati (None, None, None, 1 0 res3a_branch2a_d[0][0] __________________________________________________________________________________________________ res4b4_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b4_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding3a_branch2b_d (ZeroPaddi (None, None, None, 1 0 res3a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b4_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b4_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3a_branch2b_d[0][0] 
__________________________________________________________________________________________________ res4b4_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b4_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2b_relu_d (Activati (None, None, None, 1 0 res3a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b4_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b4_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2c_d (Conv2D) (None, None, None, 5 65536 res3a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res3a_branch1_d (Conv2D) (None, None, None, 5 131072 res2c_relu_d[0][0] __________________________________________________________________________________________________ bn4b4_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b4_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3a_d (Add) (None, None, None, 5 0 res3a_branch2c_d[0][0] res3a_branch1_d[0][0] __________________________________________________________________________________________________ res4b4_rgb (Add) (None, None, None, 1 0 bn4b4_branch2c_rgb[0][0] res4b3_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_relu_d (Activation) (None, None, None, 5 0 res3a_d[0][0] __________________________________________________________________________________________________ res4b4_relu_rgb (Activation) (None, None, None, 1 0 res4b4_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2a_d (Conv2D) (None, None, None, 1 65536 res3a_relu_d[0][0] 
__________________________________________________________________________________________________
res4b5_branch2a_rgb (Conv2D)     (None, None, None, 2 262144      res4b4_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b1_branch2a_relu_d (Activat  (None, None, None, 1 0           res3b1_branch2a_d[0][0]
__________________________________________________________________________________________________
bn4b5_branch2a_rgb (BatchNormal  (None, None, None, 2 1024        res4b5_branch2a_rgb[0][0]
__________________________________________________________________________________________________
padding3b1_branch2b_d (ZeroPadd  (None, None, None, 1 0           res3b1_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b5_branch2a_relu_rgb (Activ  (None, None, None, 2 0           bn4b5_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res3b1_branch2b_d (Conv2D)       (None, None, None, 1 147456      padding3b1_branch2b_d[0][0]
__________________________________________________________________________________________________
padding4b5_branch2b_rgb (ZeroPa  (None, None, None, 2 0           res4b5_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b1_branch2b_relu_d (Activat  (None, None, None, 1 0           res3b1_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b5_branch2b_rgb (Conv2D)     (None, None, None, 2 589824      padding4b5_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res3b1_branch2c_d (Conv2D)       (None, None, None, 5 65536       res3b1_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
bn4b5_branch2b_rgb (BatchNormal  (None, None, None, 2 1024        res4b5_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res3b1_d (Add)                   (None, None, None, 5 0           res3b1_branch2c_d[0][0]
                                                                  res3a_relu_d[0][0]
__________________________________________________________________________________________________
res4b5_branch2b_relu_rgb (Activ  (None, None, None, 2 0           bn4b5_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res3b1_relu_d (Activation)       (None, None, None, 5 0           res3b1_d[0][0]
__________________________________________________________________________________________________
res4b5_branch2c_rgb (Conv2D)     (None, None, None, 1 262144      res4b5_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b2_branch2a_d (Conv2D)       (None, None, None, 1 65536       res3b1_relu_d[0][0]
__________________________________________________________________________________________________
bn4b5_branch2c_rgb (BatchNormal  (None, None, None, 1 4096        res4b5_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res3b2_branch2a_relu_d (Activat  (None, None, None, 1 0           res3b2_branch2a_d[0][0]
__________________________________________________________________________________________________
res4b5_rgb (Add)                 (None, None, None, 1 0           bn4b5_branch2c_rgb[0][0]
                                                                  res4b4_relu_rgb[0][0]
__________________________________________________________________________________________________
padding3b2_branch2b_d (ZeroPadd  (None, None, None, 1 0           res3b2_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b5_relu_rgb (Activation)     (None, None, None, 1 0           res4b5_rgb[0][0]
__________________________________________________________________________________________________
res3b2_branch2b_d (Conv2D)       (None, None, None, 1 147456      padding3b2_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b6_branch2a_rgb (Conv2D)     (None, None, None, 2 262144      res4b5_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b2_branch2b_relu_d (Activat  (None, None, None, 1 0           res3b2_branch2b_d[0][0]
__________________________________________________________________________________________________
bn4b6_branch2a_rgb (BatchNormal  (None, None, None, 2 1024        res4b6_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res3b2_branch2c_d (Conv2D)       (None, None, None, 5 65536       res3b2_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
res4b6_branch2a_relu_rgb (Activ  (None, None, None, 2 0           bn4b6_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res3b2_d (Add)                   (None, None, None, 5 0           res3b2_branch2c_d[0][0]
                                                                  res3b1_relu_d[0][0]
__________________________________________________________________________________________________
padding4b6_branch2b_rgb (ZeroPa  (None, None, None, 2 0           res4b6_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b2_relu_d (Activation)       (None, None, None, 5 0           res3b2_d[0][0]
__________________________________________________________________________________________________
res4b6_branch2b_rgb (Conv2D)     (None, None, None, 2 589824      padding4b6_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res3b3_branch2a_d (Conv2D)       (None, None, None, 1 65536       res3b2_relu_d[0][0]
__________________________________________________________________________________________________
bn4b6_branch2b_rgb (BatchNormal  (None, None, None, 2 1024        res4b6_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res3b3_branch2a_relu_d (Activat  (None, None, None, 1 0           res3b3_branch2a_d[0][0]
__________________________________________________________________________________________________
res4b6_branch2b_relu_rgb (Activ  (None, None, None, 2 0           bn4b6_branch2b_rgb[0][0]
__________________________________________________________________________________________________
padding3b3_branch2b_d (ZeroPadd  (None, None, None, 1 0           res3b3_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b6_branch2c_rgb (Conv2D)     (None, None, None, 1 262144      res4b6_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b3_branch2b_d (Conv2D)       (None, None, None, 1 147456      padding3b3_branch2b_d[0][0]
__________________________________________________________________________________________________
bn4b6_branch2c_rgb (BatchNormal  (None, None, None, 1 4096        res4b6_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res3b3_branch2b_relu_d (Activat  (None, None, None, 1 0           res3b3_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b6_rgb (Add)                 (None, None, None, 1 0           bn4b6_branch2c_rgb[0][0]
                                                                  res4b5_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b3_branch2c_d (Conv2D)       (None, None, None, 5 65536       res3b3_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
res4b6_relu_rgb (Activation)     (None, None, None, 1 0           res4b6_rgb[0][0]
__________________________________________________________________________________________________
res3b3_d (Add)                   (None, None, None, 5 0           res3b3_branch2c_d[0][0]
                                                                  res3b2_relu_d[0][0]
__________________________________________________________________________________________________
res4b7_branch2a_rgb (Conv2D)     (None, None, None, 2 262144      res4b6_relu_rgb[0][0]
__________________________________________________________________________________________________
res3b3_relu_d (Activation)       (None, None, None, 5 0           res3b3_d[0][0]
__________________________________________________________________________________________________
bn4b7_branch2a_rgb (BatchNormal  (None, None, None, 2 1024        res4b7_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4a_branch2a_d (Conv2D)        (None, None, None, 2 131072      res3b3_relu_d[0][0]
__________________________________________________________________________________________________
res4b7_branch2a_relu_rgb (Activ  (None, None, None, 2 0           bn4b7_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4a_branch2a_relu_d (Activati  (None, None, None, 2 0           res4a_branch2a_d[0][0]
__________________________________________________________________________________________________
padding4b7_branch2b_rgb (ZeroPa  (None, None, None, 2 0           res4b7_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
padding4a_branch2b_d (ZeroPaddi  (None, None, None, 2 0           res4a_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b7_branch2b_rgb (Conv2D)     (None, None, None, 2 589824      padding4b7_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4a_branch2b_d (Conv2D)        (None, None, None, 2 589824      padding4a_branch2b_d[0][0]
__________________________________________________________________________________________________
bn4b7_branch2b_rgb (BatchNormal  (None, None, None, 2 1024        res4b7_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4a_branch2b_relu_d (Activati  (None, None, None, 2 0           res4a_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b7_branch2b_relu_rgb (Activ  (None, None, None, 2 0           bn4b7_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4a_branch2c_d (Conv2D)        (None, None, None, 1 262144      res4a_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
res4a_branch1_d (Conv2D)         (None, None, None, 1 524288      res3b3_relu_d[0][0]
__________________________________________________________________________________________________
res4b7_branch2c_rgb (Conv2D)     (None, None, None, 1 262144      res4b7_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
res4a_d (Add)                    (None, None, None, 1 0           res4a_branch2c_d[0][0]
                                                                  res4a_branch1_d[0][0]
__________________________________________________________________________________________________
bn4b7_branch2c_rgb (BatchNormal  (None, None, None, 1 4096        res4b7_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res4a_relu_d (Activation)        (None, None, None, 1 0           res4a_d[0][0]
__________________________________________________________________________________________________
res4b7_rgb (Add)                 (None, None, None, 1 0           bn4b7_branch2c_rgb[0][0]
                                                                  res4b6_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b1_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4a_relu_d[0][0]
__________________________________________________________________________________________________
res4b7_relu_rgb (Activation)     (None, None, None, 1 0           res4b7_rgb[0][0]
__________________________________________________________________________________________________
res4b1_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b1_branch2a_d[0][0]
__________________________________________________________________________________________________
res4b8_branch2a_rgb (Conv2D)     (None, None, None, 2 262144      res4b7_relu_rgb[0][0]
__________________________________________________________________________________________________
padding4b1_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b1_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
bn4b8_branch2a_rgb (BatchNormal  (None, None, None, 2 1024        res4b8_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b1_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b1_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b8_branch2a_relu_rgb (Activ  (None, None, None, 2 0           bn4b8_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b1_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b1_branch2b_d[0][0]
__________________________________________________________________________________________________
padding4b8_branch2b_rgb (ZeroPa  (None, None, None, 2 0           res4b8_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b1_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b1_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
res4b8_branch2b_rgb (Conv2D)     (None, None, None, 2 589824      padding4b8_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b1_d (Add)                   (None, None, None, 1 0           res4b1_branch2c_d[0][0]
                                                                  res4a_relu_d[0][0]
__________________________________________________________________________________________________
bn4b8_branch2b_rgb (BatchNormal  (None, None, None, 2 1024        res4b8_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b1_relu_d (Activation)       (None, None, None, 1 0           res4b1_d[0][0]
__________________________________________________________________________________________________
res4b8_branch2b_relu_rgb (Activ  (None, None, None, 2 0           bn4b8_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b2_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4b1_relu_d[0][0]
__________________________________________________________________________________________________
res4b8_branch2c_rgb (Conv2D)     (None, None, None, 1 262144      res4b8_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b2_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b2_branch2a_d[0][0]
__________________________________________________________________________________________________
bn4b8_branch2c_rgb (BatchNormal  (None, None, None, 1 4096        res4b8_branch2c_rgb[0][0]
__________________________________________________________________________________________________
padding4b2_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b2_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b8_rgb (Add)                 (None, None, None, 1 0           bn4b8_branch2c_rgb[0][0]
                                                                  res4b7_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b2_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b2_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b8_relu_rgb (Activation)     (None, None, None, 1 0           res4b8_rgb[0][0]
__________________________________________________________________________________________________
res4b2_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b2_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b9_branch2a_rgb (Conv2D)     (None, None, None, 2 262144      res4b8_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b2_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b2_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
bn4b9_branch2a_rgb (BatchNormal  (None, None, None, 2 1024        res4b9_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b2_d (Add)                   (None, None, None, 1 0           res4b2_branch2c_d[0][0]
                                                                  res4b1_relu_d[0][0]
__________________________________________________________________________________________________
res4b9_branch2a_relu_rgb (Activ  (None, None, None, 2 0           bn4b9_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b2_relu_d (Activation)       (None, None, None, 1 0           res4b2_d[0][0]
__________________________________________________________________________________________________
padding4b9_branch2b_rgb (ZeroPa  (None, None, None, 2 0           res4b9_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b3_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4b2_relu_d[0][0]
__________________________________________________________________________________________________
res4b9_branch2b_rgb (Conv2D)     (None, None, None, 2 589824      padding4b9_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b3_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b3_branch2a_d[0][0]
__________________________________________________________________________________________________
bn4b9_branch2b_rgb (BatchNormal  (None, None, None, 2 1024        res4b9_branch2b_rgb[0][0]
__________________________________________________________________________________________________
padding4b3_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b3_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b9_branch2b_relu_rgb (Activ  (None, None, None, 2 0           bn4b9_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b3_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b3_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b9_branch2c_rgb (Conv2D)     (None, None, None, 1 262144      res4b9_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b3_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b3_branch2b_d[0][0]
__________________________________________________________________________________________________
bn4b9_branch2c_rgb (BatchNormal  (None, None, None, 1 4096        res4b9_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res4b3_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b3_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
res4b9_rgb (Add)                 (None, None, None, 1 0           bn4b9_branch2c_rgb[0][0]
                                                                  res4b8_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b3_d (Add)                   (None, None, None, 1 0           res4b3_branch2c_d[0][0]
                                                                  res4b2_relu_d[0][0]
__________________________________________________________________________________________________
res4b9_relu_rgb (Activation)     (None, None, None, 1 0           res4b9_rgb[0][0]
__________________________________________________________________________________________________
res4b3_relu_d (Activation)       (None, None, None, 1 0           res4b3_d[0][0]
__________________________________________________________________________________________________
res4b10_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b9_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b4_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4b3_relu_d[0][0]
__________________________________________________________________________________________________
bn4b10_branch2a_rgb (BatchNorma  (None, None, None, 2 1024        res4b10_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b4_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b4_branch2a_d[0][0]
__________________________________________________________________________________________________
res4b10_branch2a_relu_rgb (Acti  (None, None, None, 2 0           bn4b10_branch2a_rgb[0][0]
__________________________________________________________________________________________________
padding4b4_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b4_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
padding4b10_branch2b_rgb (ZeroP  (None, None, None, 2 0           res4b10_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b4_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b4_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b10_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b10_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b4_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b4_branch2b_d[0][0]
__________________________________________________________________________________________________
bn4b10_branch2b_rgb (BatchNorma  (None, None, None, 2 1024        res4b10_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b4_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b4_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
res4b10_branch2b_relu_rgb (Acti  (None, None, None, 2 0           bn4b10_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b4_d (Add)                   (None, None, None, 1 0           res4b4_branch2c_d[0][0]
                                                                  res4b3_relu_d[0][0]
__________________________________________________________________________________________________
res4b10_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b10_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b4_relu_d (Activation)       (None, None, None, 1 0           res4b4_d[0][0]
__________________________________________________________________________________________________
bn4b10_branch2c_rgb (BatchNorma  (None, None, None, 1 4096        res4b10_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res4b5_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4b4_relu_d[0][0]
__________________________________________________________________________________________________
res4b10_rgb (Add)                (None, None, None, 1 0           bn4b10_branch2c_rgb[0][0]
                                                                  res4b9_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b5_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b5_branch2a_d[0][0]
__________________________________________________________________________________________________
res4b10_relu_rgb (Activation)    (None, None, None, 1 0           res4b10_rgb[0][0]
__________________________________________________________________________________________________
padding4b5_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b5_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b11_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b10_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b5_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b5_branch2b_d[0][0]
__________________________________________________________________________________________________
bn4b11_branch2a_rgb (BatchNorma  (None, None, None, 2 1024        res4b11_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b5_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b5_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b11_branch2a_relu_rgb (Acti  (None, None, None, 2 0           bn4b11_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b5_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b5_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
padding4b11_branch2b_rgb (ZeroP  (None, None, None, 2 0           res4b11_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b5_d (Add)                   (None, None, None, 1 0           res4b5_branch2c_d[0][0]
                                                                  res4b4_relu_d[0][0]
__________________________________________________________________________________________________
res4b11_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b11_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b5_relu_d (Activation)       (None, None, None, 1 0           res4b5_d[0][0]
__________________________________________________________________________________________________
bn4b11_branch2b_rgb (BatchNorma  (None, None, None, 2 1024        res4b11_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b6_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4b5_relu_d[0][0]
__________________________________________________________________________________________________
res4b11_branch2b_relu_rgb (Acti  (None, None, None, 2 0           bn4b11_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b6_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b6_branch2a_d[0][0]
__________________________________________________________________________________________________
res4b11_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b11_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
padding4b6_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b6_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
bn4b11_branch2c_rgb (BatchNorma  (None, None, None, 1 4096        res4b11_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res4b6_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b6_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b11_rgb (Add)                (None, None, None, 1 0           bn4b11_branch2c_rgb[0][0]
                                                                  res4b10_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b6_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b6_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b11_relu_rgb (Activation)    (None, None, None, 1 0           res4b11_rgb[0][0]
__________________________________________________________________________________________________
res4b6_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b6_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
res4b12_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b11_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b6_d (Add)                   (None, None, None, 1 0           res4b6_branch2c_d[0][0]
                                                                  res4b5_relu_d[0][0]
__________________________________________________________________________________________________
bn4b12_branch2a_rgb (BatchNorma  (None, None, None, 2 1024        res4b12_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b6_relu_d (Activation)       (None, None, None, 1 0           res4b6_d[0][0]
__________________________________________________________________________________________________
res4b12_branch2a_relu_rgb (Acti  (None, None, None, 2 0           bn4b12_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b7_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4b6_relu_d[0][0]
__________________________________________________________________________________________________
padding4b12_branch2b_rgb (ZeroP  (None, None, None, 2 0           res4b12_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b7_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b7_branch2a_d[0][0]
__________________________________________________________________________________________________
res4b12_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b12_branch2b_rgb[0][0]
__________________________________________________________________________________________________
padding4b7_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b7_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
bn4b12_branch2b_rgb (BatchNorma  (None, None, None, 2 1024        res4b12_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b7_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b7_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b12_branch2b_relu_rgb (Acti  (None, None, None, 2 0           bn4b12_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b7_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b7_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b12_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b12_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b7_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b7_branch2b_relu_d[0][0]
__________________________________________________________________________________________________
bn4b12_branch2c_rgb (BatchNorma  (None, None, None, 1 4096        res4b12_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res4b7_d (Add)                   (None, None, None, 1 0           res4b7_branch2c_d[0][0]
                                                                  res4b6_relu_d[0][0]
__________________________________________________________________________________________________
res4b12_rgb (Add)                (None, None, None, 1 0           bn4b12_branch2c_rgb[0][0]
                                                                  res4b11_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b7_relu_d (Activation)       (None, None, None, 1 0           res4b7_d[0][0]
__________________________________________________________________________________________________
res4b12_relu_rgb (Activation)    (None, None, None, 1 0           res4b12_rgb[0][0]
__________________________________________________________________________________________________
res4b8_branch2a_d (Conv2D)       (None, None, None, 2 262144      res4b7_relu_d[0][0]
__________________________________________________________________________________________________
res4b13_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b12_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b8_branch2a_relu_d (Activat  (None, None, None, 2 0           res4b8_branch2a_d[0][0]
__________________________________________________________________________________________________
bn4b13_branch2a_rgb (BatchNorma  (None, None, None, 2 1024        res4b13_branch2a_rgb[0][0]
__________________________________________________________________________________________________
padding4b8_branch2b_d (ZeroPadd  (None, None, None, 2 0           res4b8_branch2a_relu_d[0][0]
__________________________________________________________________________________________________
res4b13_branch2a_relu_rgb (Acti  (None, None, None, 2 0           bn4b13_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b8_branch2b_d (Conv2D)       (None, None, None, 2 589824      padding4b8_branch2b_d[0][0]
__________________________________________________________________________________________________
padding4b13_branch2b_rgb (ZeroP  (None, None, None, 2 0           res4b13_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b8_branch2b_relu_d (Activat  (None, None, None, 2 0           res4b8_branch2b_d[0][0]
__________________________________________________________________________________________________
res4b13_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b13_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b8_branch2c_d (Conv2D)       (None, None, None, 1 262144      res4b8_branch2b_relu_d[0][0]
__________________________________________________________________________________________________ bn4b13_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b13_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b8_d (Add) (None, None, None, 1 0 res4b8_branch2c_d[0][0] res4b7_relu_d[0][0] __________________________________________________________________________________________________ res4b13_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b13_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b8_relu_d (Activation) (None, None, None, 1 0 res4b8_d[0][0] __________________________________________________________________________________________________ res4b13_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b13_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b9_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b8_relu_d[0][0] __________________________________________________________________________________________________ bn4b13_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b13_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b9_branch2a_relu_d (Activat (None, None, None, 2 0 res4b9_branch2a_d[0][0] __________________________________________________________________________________________________ res4b13_rgb (Add) (None, None, None, 1 0 bn4b13_branch2c_rgb[0][0] res4b12_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b9_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b9_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b13_relu_rgb (Activation) (None, None, None, 1 0 res4b13_rgb[0][0] 
__________________________________________________________________________________________________ res4b9_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b9_branch2b_d[0][0] __________________________________________________________________________________________________ res4b14_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b13_relu_rgb[0][0] __________________________________________________________________________________________________ res4b9_branch2b_relu_d (Activat (None, None, None, 2 0 res4b9_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b14_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b14_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b9_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b9_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b14_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b14_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b9_d (Add) (None, None, None, 1 0 res4b9_branch2c_d[0][0] res4b8_relu_d[0][0] __________________________________________________________________________________________________ padding4b14_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b14_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b9_relu_d (Activation) (None, None, None, 1 0 res4b9_d[0][0] __________________________________________________________________________________________________ res4b14_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b14_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b10_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b9_relu_d[0][0] 
__________________________________________________________________________________________________ bn4b14_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b14_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b10_branch2a_relu_d (Activa (None, None, None, 2 0 res4b10_branch2a_d[0][0] __________________________________________________________________________________________________ res4b14_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b14_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b10_branch2b_d (ZeroPad (None, None, None, 2 0 res4b10_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b14_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b14_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b10_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b10_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b14_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b14_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b10_branch2b_relu_d (Activa (None, None, None, 2 0 res4b10_branch2b_d[0][0] __________________________________________________________________________________________________ res4b14_rgb (Add) (None, None, None, 1 0 bn4b14_branch2c_rgb[0][0] res4b13_relu_rgb[0][0] __________________________________________________________________________________________________ res4b10_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b10_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b14_relu_rgb (Activation) (None, None, None, 1 0 
res4b14_rgb[0][0] __________________________________________________________________________________________________ res4b10_d (Add) (None, None, None, 1 0 res4b10_branch2c_d[0][0] res4b9_relu_d[0][0] __________________________________________________________________________________________________ res4b15_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b14_relu_rgb[0][0] __________________________________________________________________________________________________ res4b10_relu_d (Activation) (None, None, None, 1 0 res4b10_d[0][0] __________________________________________________________________________________________________ bn4b15_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b15_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b11_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b10_relu_d[0][0] __________________________________________________________________________________________________ res4b15_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b15_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b11_branch2a_relu_d (Activa (None, None, None, 2 0 res4b11_branch2a_d[0][0] __________________________________________________________________________________________________ padding4b15_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b15_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b11_branch2b_d (ZeroPad (None, None, None, 2 0 res4b11_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b15_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b15_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b11_branch2b_d (Conv2D) (None, None, None, 2 589824 
padding4b11_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b15_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b15_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b11_branch2b_relu_d (Activa (None, None, None, 2 0 res4b11_branch2b_d[0][0] __________________________________________________________________________________________________ res4b15_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b15_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b11_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b11_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b15_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b15_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b11_d (Add) (None, None, None, 1 0 res4b11_branch2c_d[0][0] res4b10_relu_d[0][0] __________________________________________________________________________________________________ bn4b15_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b15_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b11_relu_d (Activation) (None, None, None, 1 0 res4b11_d[0][0] __________________________________________________________________________________________________ res4b15_rgb (Add) (None, None, None, 1 0 bn4b15_branch2c_rgb[0][0] res4b14_relu_rgb[0][0] __________________________________________________________________________________________________ res4b12_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b11_relu_d[0][0] __________________________________________________________________________________________________ res4b15_relu_rgb (Activation) (None, None, None, 1 0 
res4b15_rgb[0][0] __________________________________________________________________________________________________ res4b12_branch2a_relu_d (Activa (None, None, None, 2 0 res4b12_branch2a_d[0][0] __________________________________________________________________________________________________ res4b16_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b15_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b12_branch2b_d (ZeroPad (None, None, None, 2 0 res4b12_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b16_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b16_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b12_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b12_branch2b_d[0][0] __________________________________________________________________________________________________ res4b16_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b16_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b12_branch2b_relu_d (Activa (None, None, None, 2 0 res4b12_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b16_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b16_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b12_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b12_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b16_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b16_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b12_d (Add) (None, None, None, 1 0 
res4b12_branch2c_d[0][0] res4b11_relu_d[0][0] __________________________________________________________________________________________________ bn4b16_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b16_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b12_relu_d (Activation) (None, None, None, 1 0 res4b12_d[0][0] __________________________________________________________________________________________________ res4b16_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b16_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b13_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b12_relu_d[0][0] __________________________________________________________________________________________________ res4b16_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b16_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b13_branch2a_relu_d (Activa (None, None, None, 2 0 res4b13_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b16_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b16_branch2c_rgb[0][0] __________________________________________________________________________________________________ padding4b13_branch2b_d (ZeroPad (None, None, None, 2 0 res4b13_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b16_rgb (Add) (None, None, None, 1 0 bn4b16_branch2c_rgb[0][0] res4b15_relu_rgb[0][0] __________________________________________________________________________________________________ res4b13_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b13_branch2b_d[0][0] __________________________________________________________________________________________________ res4b16_relu_rgb (Activation) (None, 
None, None, 1 0 res4b16_rgb[0][0] __________________________________________________________________________________________________ res4b13_branch2b_relu_d (Activa (None, None, None, 2 0 res4b13_branch2b_d[0][0] __________________________________________________________________________________________________ res4b17_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b16_relu_rgb[0][0] __________________________________________________________________________________________________ res4b13_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b13_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b17_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b17_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b13_d (Add) (None, None, None, 1 0 res4b13_branch2c_d[0][0] res4b12_relu_d[0][0] __________________________________________________________________________________________________ res4b17_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b17_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b13_relu_d (Activation) (None, None, None, 1 0 res4b13_d[0][0] __________________________________________________________________________________________________ padding4b17_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b17_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b14_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b13_relu_d[0][0] __________________________________________________________________________________________________ res4b17_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b17_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b14_branch2a_relu_d (Activa (None, None, None, 2 0 
res4b14_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b17_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b17_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b14_branch2b_d (ZeroPad (None, None, None, 2 0 res4b14_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b17_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b17_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b14_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b14_branch2b_d[0][0] __________________________________________________________________________________________________ res4b17_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b17_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b14_branch2b_relu_d (Activa (None, None, None, 2 0 res4b14_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b17_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b17_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b14_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b14_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b17_rgb (Add) (None, None, None, 1 0 bn4b17_branch2c_rgb[0][0] res4b16_relu_rgb[0][0] __________________________________________________________________________________________________ res4b14_d (Add) (None, None, None, 1 0 res4b14_branch2c_d[0][0] res4b13_relu_d[0][0] __________________________________________________________________________________________________ res4b17_relu_rgb (Activation) 
(None, None, None, 1 0 res4b17_rgb[0][0] __________________________________________________________________________________________________ res4b14_relu_d (Activation) (None, None, None, 1 0 res4b14_d[0][0] __________________________________________________________________________________________________ res4b18_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b17_relu_rgb[0][0] __________________________________________________________________________________________________ res4b15_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b14_relu_d[0][0] __________________________________________________________________________________________________ bn4b18_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b18_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b15_branch2a_relu_d (Activa (None, None, None, 2 0 res4b15_branch2a_d[0][0] __________________________________________________________________________________________________ res4b18_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b18_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding4b15_branch2b_d (ZeroPad (None, None, None, 2 0 res4b15_branch2a_relu_d[0][0] __________________________________________________________________________________________________ padding4b18_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b18_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b15_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b15_branch2b_d[0][0] __________________________________________________________________________________________________ res4b18_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b18_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b15_branch2b_relu_d (Activa (None, None, None, 2 
0 res4b15_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b18_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b18_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b15_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b15_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b18_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b18_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b15_d (Add) (None, None, None, 1 0 res4b15_branch2c_d[0][0] res4b14_relu_d[0][0] __________________________________________________________________________________________________ res4b18_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b18_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b15_relu_d (Activation) (None, None, None, 1 0 res4b15_d[0][0] __________________________________________________________________________________________________ bn4b18_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b18_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b15_relu_d[0][0] __________________________________________________________________________________________________ res4b18_rgb (Add) (None, None, None, 1 0 bn4b18_branch2c_rgb[0][0] res4b17_relu_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2a_relu_d (Activa (None, None, None, 2 0 res4b16_branch2a_d[0][0] __________________________________________________________________________________________________ res4b18_relu_rgb (Activation) (None, None, None, 1 0 
res4b18_rgb[0][0] __________________________________________________________________________________________________ padding4b16_branch2b_d (ZeroPad (None, None, None, 2 0 res4b16_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b18_relu_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b16_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b19_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2b_relu_d (Activa (None, None, None, 2 0 res4b16_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b19_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b16_branch2b_relu_d[0][0] __________________________________________________________________________________________________ padding4b19_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b19_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b16_d (Add) (None, None, None, 1 0 res4b16_branch2c_d[0][0] res4b15_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b16_relu_d (Activation) (None, None, None, 1 
0 res4b16_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b16_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2a_relu_d (Activa (None, None, None, 2 0 res4b17_branch2a_d[0][0] __________________________________________________________________________________________________ res4b19_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b17_branch2b_d (ZeroPad (None, None, None, 2 0 res4b17_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b19_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b17_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_rgb (Add) (None, None, None, 1 0 bn4b19_branch2c_rgb[0][0] res4b18_relu_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2b_relu_d (Activa (None, None, None, 2 0 res4b17_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_relu_rgb (Activation) (None, None, None, 1 0 
res4b19_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b17_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b20_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b19_relu_rgb[0][0] __________________________________________________________________________________________________ res4b17_d (Add) (None, None, None, 1 0 res4b17_branch2c_d[0][0] res4b16_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b20_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b17_relu_d (Activation) (None, None, None, 1 0 res4b17_d[0][0] __________________________________________________________________________________________________ res4b20_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b20_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b17_relu_d[0][0] __________________________________________________________________________________________________ padding4b20_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b20_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2a_relu_d (Activa (None, None, None, 2 0 res4b18_branch2a_d[0][0] __________________________________________________________________________________________________ res4b20_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b18_branch2b_d (ZeroPad (None, None, None, 2 0 
res4b18_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b18_branch2b_d[0][0] __________________________________________________________________________________________________ res4b20_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2b_relu_d (Activa (None, None, None, 2 0 res4b18_branch2b_d[0][0] __________________________________________________________________________________________________ res4b20_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b18_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b20_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b18_d (Add) (None, None, None, 1 0 res4b18_branch2c_d[0][0] res4b17_relu_d[0][0] __________________________________________________________________________________________________ res4b20_rgb (Add) (None, None, None, 1 0 bn4b20_branch2c_rgb[0][0] res4b19_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_relu_d (Activation) (None, None, None, 1 0 res4b18_d[0][0] __________________________________________________________________________________________________ res4b20_relu_rgb (Activation) (None, None, 
None, 1 0 res4b20_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b18_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b20_relu_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu_d (Activa (None, None, None, 2 0 res4b19_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b21_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b21_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding4b19_branch2b_d (ZeroPad (None, None, None, 2 0 res4b19_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b21_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b21_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b21_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu_d (Activa (None, None, None, 2 0 res4b19_branch2b_d[0][0] __________________________________________________________________________________________________ res4b21_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2c_d (Conv2D) (None, None, None, 1 
262144 res4b19_branch2b_relu_d[0][0]
bn4b21_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b21_branch2b_rgb[0][0]
res4b19_d (Add) (None, None, None, 1 0 res4b19_branch2c_d[0][0] res4b18_relu_d[0][0]
res4b21_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b21_branch2b_rgb[0][0]
res4b19_relu_d (Activation) (None, None, None, 1 0 res4b19_d[0][0]
res4b21_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu_rgb[0][0]
res4b20_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b19_relu_d[0][0]
bn4b21_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b21_branch2c_rgb[0][0]
res4b20_branch2a_relu_d (Activa (None, None, None, 2 0 res4b20_branch2a_d[0][0]
res4b21_rgb (Add) (None, None, None, 1 0 bn4b21_branch2c_rgb[0][0] res4b20_relu_rgb[0][0]
padding4b20_branch2b_d (ZeroPad (None, None, None, 2 0 res4b20_branch2a_relu_d[0][0]
res4b21_relu_rgb (Activation) (None, None, None, 1 0 res4b21_rgb[0][0]
res4b20_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b20_branch2b_d[0][0]
res4b22_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b21_relu_rgb[0][0]
res4b20_branch2b_relu_d (Activa (None, None, None, 2 0 res4b20_branch2b_d[0][0]
bn4b22_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b22_branch2a_rgb[0][0]
res4b20_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu_d[0][0]
res4b22_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b22_branch2a_rgb[0][0]
res4b20_d (Add) (None, None, None, 1 0 res4b20_branch2c_d[0][0] res4b19_relu_d[0][0]
padding4b22_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b22_branch2a_relu_rgb[0][0]
res4b20_relu_d (Activation) (None, None, None, 1 0 res4b20_d[0][0]
res4b22_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b_rgb[0][0]
res4b21_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b20_relu_d[0][0]
bn4b22_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b22_branch2b_rgb[0][0]
res4b21_branch2a_relu_d (Activa (None, None, None, 2 0 res4b21_branch2a_d[0][0]
res4b22_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b22_branch2b_rgb[0][0]
padding4b21_branch2b_d (ZeroPad (None, None, None, 2 0 res4b21_branch2a_relu_d[0][0]
res4b22_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu_rgb[0][0]
res4b21_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b_d[0][0]
bn4b22_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b22_branch2c_rgb[0][0]
res4b21_branch2b_relu_d (Activa (None, None, None, 2 0 res4b21_branch2b_d[0][0]
res4b22_rgb (Add) (None, None, None, 1 0 bn4b22_branch2c_rgb[0][0] res4b21_relu_rgb[0][0]
res4b21_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu_d[0][0]
res4b22_relu_rgb (Activation) (None, None, None, 1 0 res4b22_rgb[0][0]
res4b21_d (Add) (None, None, None, 1 0 res4b21_branch2c_d[0][0] res4b20_relu_d[0][0]
res5a_branch2a_rgb (Conv2D) (None, None, None, 5 524288 res4b22_relu_rgb[0][0]
res4b21_relu_d (Activation) (None, None, None, 1 0 res4b21_d[0][0]
bn5a_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5a_branch2a_rgb[0][0]
res4b22_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b21_relu_d[0][0]
res5a_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2a_rgb[0][0]
res4b22_branch2a_relu_d (Activa (None, None, None, 2 0 res4b22_branch2a_d[0][0]
padding5a_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5a_branch2a_relu_rgb[0][0]
padding4b22_branch2b_d (ZeroPad (None, None, None, 2 0 res4b22_branch2a_relu_d[0][0]
res5a_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_rgb[0][0]
res4b22_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b_d[0][0]
bn5a_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5a_branch2b_rgb[0][0]
res4b22_branch2b_relu_d (Activa (None, None, None, 2 0 res4b22_branch2b_d[0][0]
res5a_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2b_rgb[0][0]
res4b22_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu_d[0][0]
res5a_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_rgb[0][0]
res5a_branch1_rgb (Conv2D) (None, None, None, 2 2097152 res4b22_relu_rgb[0][0]
res4b22_d (Add) (None, None, None, 1 0 res4b22_branch2c_d[0][0] res4b21_relu_d[0][0]
bn5a_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5a_branch2c_rgb[0][0]
bn5a_branch1_rgb (BatchNormaliz (None, None, None, 2 8192 res5a_branch1_rgb[0][0]
res4b22_relu_d (Activation) (None, None, None, 1 0 res4b22_d[0][0]
res5a_rgb (Add) (None, None, None, 2 0 bn5a_branch2c_rgb[0][0] bn5a_branch1_rgb[0][0]
res5a_branch2a_d (Conv2D) (None, None, None, 5 524288 res4b22_relu_d[0][0]
res5a_relu_rgb (Activation) (None, None, None, 2 0 res5a_rgb[0][0]
res5a_branch2a_relu_d (Activati (None, None, None, 5 0 res5a_branch2a_d[0][0]
res5b_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5a_relu_rgb[0][0]
padding5a_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5a_branch2a_relu_d[0][0]
bn5b_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2a_rgb[0][0]
res5a_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_d[0][0]
res5b_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2a_rgb[0][0]
res5a_branch2b_relu_d (Activati (None, None, None, 5 0 res5a_branch2b_d[0][0]
padding5b_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5b_branch2a_relu_rgb[0][0]
res5a_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_d[0][0]
res5a_branch1_d (Conv2D) (None, None, None, 2 2097152 res4b22_relu_d[0][0]
res5b_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_rgb[0][0]
res5a_d (Add) (None, None, None, 2 0 res5a_branch2c_d[0][0] res5a_branch1_d[0][0]
bn5b_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2b_rgb[0][0]
res5a_relu_d (Activation) (None, None, None, 2 0 res5a_d[0][0]
res5b_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2b_rgb[0][0]
res5b_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5a_relu_d[0][0]
res5b_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_rgb[0][0]
res5b_branch2a_relu_d (Activati (None, None, None, 5 0 res5b_branch2a_d[0][0]
bn5b_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5b_branch2c_rgb[0][0]
padding5b_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5b_branch2a_relu_d[0][0]
res5b_rgb (Add) (None, None, None, 2 0 bn5b_branch2c_rgb[0][0] res5a_relu_rgb[0][0]
res5b_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_d[0][0]
res5b_relu_rgb (Activation) (None, None, None, 2 0 res5b_rgb[0][0]
res5b_branch2b_relu_d (Activati (None, None, None, 5 0 res5b_branch2b_d[0][0]
res5c_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5b_relu_rgb[0][0]
res5b_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_d[0][0]
bn5c_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2a_rgb[0][0]
res5b_d (Add) (None, None, None, 2 0 res5b_branch2c_d[0][0] res5a_relu_d[0][0]
res5c_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2a_rgb[0][0]
res5b_relu_d (Activation) (None, None, None, 2 0 res5b_d[0][0]
padding5c_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5c_branch2a_relu_rgb[0][0]
res5c_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5b_relu_d[0][0]
res5c_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_rgb[0][0]
res5c_branch2a_relu_d (Activati (None, None, None, 5 0 res5c_branch2a_d[0][0]
bn5c_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2b_rgb[0][0]
padding5c_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5c_branch2a_relu_d[0][0]
res5c_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2b_rgb[0][0]
res5c_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_d[0][0]
res5c_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_rgb[0][0]
res5c_branch2b_relu_d (Activati (None, None, None, 5 0 res5c_branch2b_d[0][0]
bn5c_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5c_branch2c_rgb[0][0]
res5c_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_d[0][0]
res5c_rgb (Add) (None, None, None, 2 0 bn5c_branch2c_rgb[0][0] res5b_relu_rgb[0][0]
res5c_d (Add) (None, None, None, 2 0 res5c_branch2c_d[0][0] res5b_relu_d[0][0]
res5c_relu_rgb (Activation) (None, None, None, 2 0 res5c_rgb[0][0]
res5c_relu_d (Activation) (None, None, None, 2 0 res5c_d[0][0]
concatenate_33 (Concatenate) (None, None, None, 4 0 res5c_relu_rgb[0][0] res5c_relu_d[0][0]
C5_reduced (Conv2D) (None, None, None, 2 1048832 concatenate_33[0][0]
concatenate_30 (Concatenate) (None, None, None, 2 0 res4b22_relu_rgb[0][0] res4b22_relu_d[0][0]
P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] concatenate_30[0][0]
C4_reduced (Conv2D) (None, None, None, 2 524544 concatenate_30[0][0]
P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0]
concatenate_7 (Concatenate) (None, None, None, 1 0 res3b3_relu_rgb[0][0] res3b3_relu_d[0][0]
P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] concatenate_7[0][0]
C3_reduced (Conv2D) (None, None, None, 2 262400 concatenate_7[0][0]
P6 (Conv2D) (None, None, None, 2 9437440 concatenate_33[0][0]
P3_merged (Add) (None, None, None, 2 0 P4_upsampled[0][0] C3_reduced[0][0]
C6_relu (Activation) (None, None, None, 2 0 P6[0][0]
P3 (Conv2D) (None, None, None, 2 590080 P3_merged[0][0]
P4 (Conv2D) (None, None, None, 2 590080 P4_merged[0][0]
P5 (Conv2D) (None, None, None, 2 590080 C5_reduced[0][0]
P7 (Conv2D) (None, None, None, 2 590080 C6_relu[0][0]
regression_submodel (Model) (None, None, 4) 2443300 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0]
classification_submodel (Model) (None, None, 1) 2381065 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0]
regression (Concatenate) (None, None, 4) 0 regression_submodel[1][0] regression_submodel[2][0] regression_submodel[3][0] regression_submodel[4][0] regression_submodel[5][0]
classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0]
==================================================================================================
Total params: 103,451,949
Trainable params: 103,241,261
Non-trainable params: 210,688
__________________________________________________________________________________________________
None
Epoch 1/20
  1/500 [..............................] - ETA: 1:05:37 - loss: 4.1425 - regression_loss: 3.0124 - classification_loss: 1.1300
  2/500 [..............................] - ETA: 34:15 - loss: 4.0701 - regression_loss: 2.9399 - classification_loss: 1.1302
  3/500 [..............................] - ETA: 23:51 - loss: 4.1029 - regression_loss: 2.9734 - classification_loss: 1.1296
  4/500 [..............................] - ETA: 18:43 - loss: 4.1190 - regression_loss: 2.9895 - classification_loss: 1.1295
  5/500 [..............................] - ETA: 15:37 - loss: 4.1042 - regression_loss: 2.9749 - classification_loss: 1.1293
  6/500 [..............................] - ETA: 13:34 - loss: 4.1079 - regression_loss: 2.9781 - classification_loss: 1.1298
  7/500 [..............................] - ETA: 12:02 - loss: 4.0683 - regression_loss: 2.9385 - classification_loss: 1.1298
  8/500 [..............................] - ETA: 10:53 - loss: 4.0556 - regression_loss: 2.9260 - classification_loss: 1.1296
  9/500 [..............................] - ETA: 9:59 - loss: 4.0334 - regression_loss: 2.9035 - classification_loss: 1.1299
 10/500 [..............................] - ETA: 9:16 - loss: 4.0353 - regression_loss: 2.9056 - classification_loss: 1.1297
 11/500 [..............................]
- ETA: 8:42 - loss: 4.0676 - regression_loss: 2.9378 - classification_loss: 1.1297
 12/500 [..............................] - ETA: 8:14 - loss: 4.0578 - regression_loss: 2.9282 - classification_loss: 1.1296
 13/500 [..............................] - ETA: 7:50 - loss: 4.0626 - regression_loss: 2.9331 - classification_loss: 1.1295
 14/500 [..............................] - ETA: 7:30 - loss: 4.0553 - regression_loss: 2.9259 - classification_loss: 1.1294
 15/500 [..............................] - ETA: 7:12 - loss: 4.0530 - regression_loss: 2.9237 - classification_loss: 1.1293
 16/500 [..............................] - ETA: 6:57 - loss: 4.0494 - regression_loss: 2.9202 - classification_loss: 1.1293
 17/500 [>.............................] - ETA: 6:43 - loss: 4.0463 - regression_loss: 2.9171 - classification_loss: 1.1292
 18/500 [>.............................] - ETA: 6:30 - loss: 4.0505 - regression_loss: 2.9212 - classification_loss: 1.1292
 19/500 [>.............................] - ETA: 6:18 - loss: 4.0476 - regression_loss: 2.9184 - classification_loss: 1.1292
 20/500 [>.............................] - ETA: 6:08 - loss: 4.0566 - regression_loss: 2.9276 - classification_loss: 1.1290
 21/500 [>.............................] - ETA: 5:59 - loss: 4.0532 - regression_loss: 2.9242 - classification_loss: 1.1290
 22/500 [>.............................] - ETA: 5:50 - loss: 4.0578 - regression_loss: 2.9289 - classification_loss: 1.1289
 23/500 [>.............................] - ETA: 5:42 - loss: 4.0556 - regression_loss: 2.9269 - classification_loss: 1.1287
 24/500 [>.............................] - ETA: 5:35 - loss: 4.0575 - regression_loss: 2.9288 - classification_loss: 1.1286
 25/500 [>.............................] - ETA: 5:28 - loss: 4.0479 - regression_loss: 2.9192 - classification_loss: 1.1287
 26/500 [>.............................] - ETA: 5:22 - loss: 4.0466 - regression_loss: 2.9181 - classification_loss: 1.1285
 27/500 [>.............................] - ETA: 5:16 - loss: 4.0378 - regression_loss: 2.9091 - classification_loss: 1.1287
 28/500 [>.............................] - ETA: 5:10 - loss: 4.0290 - regression_loss: 2.9001 - classification_loss: 1.1288
 29/500 [>.............................] - ETA: 5:05 - loss: 4.0347 - regression_loss: 2.9060 - classification_loss: 1.1286
 30/500 [>.............................] - ETA: 5:00 - loss: 4.0323 - regression_loss: 2.9038 - classification_loss: 1.1284
 31/500 [>.............................] - ETA: 4:56 - loss: 4.0328 - regression_loss: 2.9045 - classification_loss: 1.1283
 32/500 [>.............................] - ETA: 4:51 - loss: 4.0307 - regression_loss: 2.9025 - classification_loss: 1.1282
 33/500 [>.............................] - ETA: 4:47 - loss: 4.0321 - regression_loss: 2.9041 - classification_loss: 1.1281
 34/500 [=>............................] - ETA: 4:43 - loss: 4.0236 - regression_loss: 2.8954 - classification_loss: 1.1282
 35/500 [=>............................] - ETA: 4:40 - loss: 4.0253 - regression_loss: 2.8973 - classification_loss: 1.1280
 36/500 [=>............................] - ETA: 4:36 - loss: 4.0309 - regression_loss: 2.9030 - classification_loss: 1.1279
 37/500 [=>............................] - ETA: 4:33 - loss: 4.0251 - regression_loss: 2.8973 - classification_loss: 1.1278
 38/500 [=>............................] - ETA: 4:30 - loss: 4.0211 - regression_loss: 2.8935 - classification_loss: 1.1276
 39/500 [=>............................] - ETA: 4:27 - loss: 4.0232 - regression_loss: 2.8955 - classification_loss: 1.1277
 40/500 [=>............................] - ETA: 4:24 - loss: 4.0214 - regression_loss: 2.8939 - classification_loss: 1.1275
 41/500 [=>............................] - ETA: 4:21 - loss: 4.0225 - regression_loss: 2.8951 - classification_loss: 1.1273
 42/500 [=>............................] - ETA: 4:19 - loss: 4.0247 - regression_loss: 2.8975 - classification_loss: 1.1272
 43/500 [=>............................] - ETA: 4:16 - loss: 4.0256 - regression_loss: 2.8987 - classification_loss: 1.1269
 44/500 [=>............................] - ETA: 4:14 - loss: 4.0216 - regression_loss: 2.8949 - classification_loss: 1.1267
 45/500 [=>............................] - ETA: 4:12 - loss: 4.0092 - regression_loss: 2.8826 - classification_loss: 1.1267
 46/500 [=>............................] - ETA: 4:09 - loss: 4.0054 - regression_loss: 2.8787 - classification_loss: 1.1267
 47/500 [=>............................] - ETA: 4:07 - loss: 3.9980 - regression_loss: 2.8716 - classification_loss: 1.1264
 48/500 [=>............................] - ETA: 4:05 - loss: 3.9948 - regression_loss: 2.8688 - classification_loss: 1.1260
 49/500 [=>............................] - ETA: 4:03 - loss: 3.9953 - regression_loss: 2.8696 - classification_loss: 1.1257
 50/500 [==>...........................] - ETA: 4:02 - loss: 3.9922 - regression_loss: 2.8667 - classification_loss: 1.1255
 51/500 [==>...........................] - ETA: 4:00 - loss: 3.9913 - regression_loss: 2.8662 - classification_loss: 1.1251
 52/500 [==>...........................] - ETA: 3:58 - loss: 3.9897 - regression_loss: 2.8650 - classification_loss: 1.1247
 53/500 [==>...........................] - ETA: 3:56 - loss: 3.9888 - regression_loss: 2.8644 - classification_loss: 1.1244
 54/500 [==>...........................] - ETA: 3:54 - loss: 3.9848 - regression_loss: 2.8608 - classification_loss: 1.1240
 55/500 [==>...........................] - ETA: 3:53 - loss: 3.9822 - regression_loss: 2.8587 - classification_loss: 1.1235
 56/500 [==>...........................] - ETA: 3:51 - loss: 3.9784 - regression_loss: 2.8553 - classification_loss: 1.1231
 57/500 [==>...........................] - ETA: 3:50 - loss: 3.9743 - regression_loss: 2.8517 - classification_loss: 1.1226
 58/500 [==>...........................] - ETA: 3:48 - loss: 3.9705 - regression_loss: 2.8484 - classification_loss: 1.1221
 59/500 [==>...........................] - ETA: 3:46 - loss: 3.9661 - regression_loss: 2.8446 - classification_loss: 1.1215
 60/500 [==>...........................] - ETA: 3:45 - loss: 3.9622 - regression_loss: 2.8406 - classification_loss: 1.1216
 61/500 [==>...........................] - ETA: 3:44 - loss: 3.9577 - regression_loss: 2.8366 - classification_loss: 1.1211
 62/500 [==>...........................] - ETA: 3:42 - loss: 3.9545 - regression_loss: 2.8340 - classification_loss: 1.1205
 63/500 [==>...........................] - ETA: 3:41 - loss: 3.9479 - regression_loss: 2.8276 - classification_loss: 1.1203
 64/500 [==>...........................] - ETA: 3:39 - loss: 3.9392 - regression_loss: 2.8200 - classification_loss: 1.1192
 65/500 [==>...........................] - ETA: 3:38 - loss: 3.9391 - regression_loss: 2.8202 - classification_loss: 1.1189
 66/500 [==>...........................] - ETA: 3:37 - loss: 3.9332 - regression_loss: 2.8152 - classification_loss: 1.1181
 67/500 [===>..........................] - ETA: 3:35 - loss: 3.9259 - regression_loss: 2.8082 - classification_loss: 1.1177
 68/500 [===>..........................] - ETA: 3:34 - loss: 3.9223 - regression_loss: 2.8052 - classification_loss: 1.1172
 69/500 [===>..........................] - ETA: 3:33 - loss: 3.9173 - regression_loss: 2.8006 - classification_loss: 1.1167
 70/500 [===>..........................] - ETA: 3:32 - loss: 3.9172 - regression_loss: 2.8007 - classification_loss: 1.1165
 71/500 [===>..........................] - ETA: 3:31 - loss: 3.9118 - regression_loss: 2.7964 - classification_loss: 1.1154
 72/500 [===>..........................] - ETA: 3:30 - loss: 3.9050 - regression_loss: 2.7907 - classification_loss: 1.1143
 73/500 [===>..........................] - ETA: 3:28 - loss: 3.9042 - regression_loss: 2.7902 - classification_loss: 1.1140
 74/500 [===>..........................] - ETA: 3:27 - loss: 3.8993 - regression_loss: 2.7855 - classification_loss: 1.1138
 75/500 [===>..........................] - ETA: 3:26 - loss: 3.8954 - regression_loss: 2.7825 - classification_loss: 1.1129
 76/500 [===>..........................] - ETA: 3:25 - loss: 3.8920 - regression_loss: 2.7807 - classification_loss: 1.1112
 77/500 [===>..........................] - ETA: 3:24 - loss: 3.8881 - regression_loss: 2.7787 - classification_loss: 1.1094
 78/500 [===>..........................] - ETA: 3:23 - loss: 3.8853 - regression_loss: 2.7759 - classification_loss: 1.1094
 79/500 [===>..........................] - ETA: 3:22 - loss: 3.8808 - regression_loss: 2.7719 - classification_loss: 1.1089
 80/500 [===>..........................] - ETA: 3:21 - loss: 3.8761 - regression_loss: 2.7694 - classification_loss: 1.1067
 81/500 [===>..........................] - ETA: 3:20 - loss: 3.8726 - regression_loss: 2.7660 - classification_loss: 1.1066
 82/500 [===>..........................] - ETA: 3:20 - loss: 3.8676 - regression_loss: 2.7615 - classification_loss: 1.1060
 83/500 [===>..........................] - ETA: 3:19 - loss: 3.8692 - regression_loss: 2.7645 - classification_loss: 1.1047
 84/500 [====>.........................] - ETA: 3:18 - loss: 3.8679 - regression_loss: 2.7649 - classification_loss: 1.1029
 85/500 [====>.........................] - ETA: 3:17 - loss: 3.8621 - regression_loss: 2.7611 - classification_loss: 1.1009
 86/500 [====>.........................] - ETA: 3:16 - loss: 3.8566 - regression_loss: 2.7580 - classification_loss: 1.0986
 87/500 [====>.........................] - ETA: 3:15 - loss: 3.8522 - regression_loss: 2.7559 - classification_loss: 1.0963
 88/500 [====>.........................] - ETA: 3:14 - loss: 3.8501 - regression_loss: 2.7539 - classification_loss: 1.0962
 89/500 [====>.........................] - ETA: 3:13 - loss: 3.8454 - regression_loss: 2.7510 - classification_loss: 1.0944
 90/500 [====>.........................] - ETA: 3:12 - loss: 3.8455 - regression_loss: 2.7514 - classification_loss: 1.0941
 91/500 [====>.........................] - ETA: 3:11 - loss: 3.8396 - regression_loss: 2.7477 - classification_loss: 1.0920
 92/500 [====>.........................] - ETA: 3:10 - loss: 3.8371 - regression_loss: 2.7467 - classification_loss: 1.0904
 93/500 [====>.........................] - ETA: 3:09 - loss: 3.8353 - regression_loss: 2.7454 - classification_loss: 1.0899
 94/500 [====>.........................] - ETA: 3:08 - loss: 3.8291 - regression_loss: 2.7421 - classification_loss: 1.0871
 95/500 [====>.........................] - ETA: 3:07 - loss: 3.8217 - regression_loss: 2.7375 - classification_loss: 1.0842
 96/500 [====>.........................] - ETA: 3:06 - loss: 3.8160 - regression_loss: 2.7341 - classification_loss: 1.0819
 97/500 [====>.........................] - ETA: 3:06 - loss: 3.8115 - regression_loss: 2.7316 - classification_loss: 1.0799
 98/500 [====>.........................] - ETA: 3:05 - loss: 3.8057 - regression_loss: 2.7290 - classification_loss: 1.0767
 99/500 [====>.........................] - ETA: 3:04 - loss: 3.8067 - regression_loss: 2.7305 - classification_loss: 1.0762
100/500 [=====>........................] - ETA: 3:03 - loss: 3.8024 - regression_loss: 2.7296 - classification_loss: 1.0728
101/500 [=====>........................] - ETA: 3:02 - loss: 3.7968 - regression_loss: 2.7271 - classification_loss: 1.0697
102/500 [=====>........................] - ETA: 3:02 - loss: 3.7897 - regression_loss: 2.7242 - classification_loss: 1.0655
103/500 [=====>........................] - ETA: 3:01 - loss: 3.7826 - regression_loss: 2.7211 - classification_loss: 1.0615
104/500 [=====>........................] - ETA: 3:00 - loss: 3.7817 - regression_loss: 2.7197 - classification_loss: 1.0620
105/500 [=====>........................] - ETA: 2:59 - loss: 3.7809 - regression_loss: 2.7195 - classification_loss: 1.0614
106/500 [=====>........................] - ETA: 2:59 - loss: 3.7806 - regression_loss: 2.7182 - classification_loss: 1.0624
107/500 [=====>........................] - ETA: 2:58 - loss: 3.7760 - regression_loss: 2.7159 - classification_loss: 1.0601
108/500 [=====>........................] - ETA: 2:57 - loss: 3.7773 - regression_loss: 2.7169 - classification_loss: 1.0604
109/500 [=====>........................] - ETA: 2:56 - loss: 3.7714 - regression_loss: 2.7135 - classification_loss: 1.0579
110/500 [=====>........................] - ETA: 2:56 - loss: 3.7755 - regression_loss: 2.7181 - classification_loss: 1.0575
111/500 [=====>........................] - ETA: 2:55 - loss: 3.7713 - regression_loss: 2.7169 - classification_loss: 1.0544
112/500 [=====>........................] - ETA: 2:54 - loss: 3.7641 - regression_loss: 2.7137 - classification_loss: 1.0504
113/500 [=====>........................] - ETA: 2:54 - loss: 3.7633 - regression_loss: 2.7134 - classification_loss: 1.0499
114/500 [=====>........................] - ETA: 2:53 - loss: 3.7570 - regression_loss: 2.7102 - classification_loss: 1.0468
115/500 [=====>........................] - ETA: 2:52 - loss: 3.7508 - regression_loss: 2.7079 - classification_loss: 1.0429
116/500 [=====>........................] - ETA: 2:52 - loss: 3.7506 - regression_loss: 2.7078 - classification_loss: 1.0429
117/500 [======>.......................] - ETA: 2:51 - loss: 3.7437 - regression_loss: 2.7046 - classification_loss: 1.0391
118/500 [======>.......................] - ETA: 2:50 - loss: 3.7404 - regression_loss: 2.7021 - classification_loss: 1.0384
119/500 [======>.......................] - ETA: 2:49 - loss: 3.7359 - regression_loss: 2.7007 - classification_loss: 1.0352
120/500 [======>.......................] - ETA: 2:49 - loss: 3.7304 - regression_loss: 2.6988 - classification_loss: 1.0317
121/500 [======>.......................] - ETA: 2:48 - loss: 3.7254 - regression_loss: 2.6975 - classification_loss: 1.0279
122/500 [======>.......................] - ETA: 2:48 - loss: 3.7192 - regression_loss: 2.6958 - classification_loss: 1.0234
123/500 [======>.......................] - ETA: 2:47 - loss: 3.7136 - regression_loss: 2.6940 - classification_loss: 1.0196
124/500 [======>.......................] - ETA: 2:46 - loss: 3.7070 - regression_loss: 2.6909 - classification_loss: 1.0162
125/500 [======>.......................] - ETA: 2:46 - loss: 3.7002 - regression_loss: 2.6880 - classification_loss: 1.0122
126/500 [======>.......................] - ETA: 2:45 - loss: 3.6931 - regression_loss: 2.6854 - classification_loss: 1.0077
127/500 [======>.......................] - ETA: 2:44 - loss: 3.6921 - regression_loss: 2.6842 - classification_loss: 1.0079
128/500 [======>.......................] - ETA: 2:44 - loss: 3.6853 - regression_loss: 2.6815 - classification_loss: 1.0037
129/500 [======>.......................] - ETA: 2:43 - loss: 3.6793 - regression_loss: 2.6789 - classification_loss: 1.0004
130/500 [======>.......................] - ETA: 2:42 - loss: 3.6731 - regression_loss: 2.6766 - classification_loss: 0.9964
131/500 [======>.......................] - ETA: 2:42 - loss: 3.6665 - regression_loss: 2.6745 - classification_loss: 0.9920
132/500 [======>.......................] - ETA: 2:41 - loss: 3.6620 - regression_loss: 2.6729 - classification_loss: 0.9891
133/500 [======>.......................] - ETA: 2:41 - loss: 3.6614 - regression_loss: 2.6726 - classification_loss: 0.9888
134/500 [=======>......................] - ETA: 2:40 - loss: 3.6592 - regression_loss: 2.6714 - classification_loss: 0.9878
135/500 [=======>......................] - ETA: 2:40 - loss: 3.6541 - regression_loss: 2.6698 - classification_loss: 0.9843
136/500 [=======>......................] - ETA: 2:39 - loss: 3.6532 - regression_loss: 2.6702 - classification_loss: 0.9831
137/500 [=======>......................] - ETA: 2:39 - loss: 3.6508 - regression_loss: 2.6693 - classification_loss: 0.9815
138/500 [=======>......................] - ETA: 2:38 - loss: 3.6456 - regression_loss: 2.6674 - classification_loss: 0.9782
139/500 [=======>......................]
- ETA: 2:37 - loss: 3.6383 - regression_loss: 2.6638 - classification_loss: 0.9744 140/500 [=======>......................] - ETA: 2:37 - loss: 3.6365 - regression_loss: 2.6633 - classification_loss: 0.9732 141/500 [=======>......................] - ETA: 2:36 - loss: 3.6309 - regression_loss: 2.6613 - classification_loss: 0.9697 142/500 [=======>......................] - ETA: 2:36 - loss: 3.6325 - regression_loss: 2.6643 - classification_loss: 0.9682 143/500 [=======>......................] - ETA: 2:35 - loss: 3.6271 - regression_loss: 2.6620 - classification_loss: 0.9652 144/500 [=======>......................] - ETA: 2:35 - loss: 3.6209 - regression_loss: 2.6596 - classification_loss: 0.9612 145/500 [=======>......................] - ETA: 2:34 - loss: 3.6148 - regression_loss: 2.6573 - classification_loss: 0.9575 146/500 [=======>......................] - ETA: 2:33 - loss: 3.6125 - regression_loss: 2.6556 - classification_loss: 0.9569 147/500 [=======>......................] - ETA: 2:33 - loss: 3.6141 - regression_loss: 2.6575 - classification_loss: 0.9566 148/500 [=======>......................] - ETA: 2:32 - loss: 3.6091 - regression_loss: 2.6558 - classification_loss: 0.9532 149/500 [=======>......................] - ETA: 2:32 - loss: 3.6081 - regression_loss: 2.6568 - classification_loss: 0.9513 150/500 [========>.....................] - ETA: 2:31 - loss: 3.6055 - regression_loss: 2.6562 - classification_loss: 0.9494 151/500 [========>.....................] - ETA: 2:31 - loss: 3.6006 - regression_loss: 2.6545 - classification_loss: 0.9461 152/500 [========>.....................] - ETA: 2:30 - loss: 3.5958 - regression_loss: 2.6537 - classification_loss: 0.9421 153/500 [========>.....................] - ETA: 2:30 - loss: 3.5917 - regression_loss: 2.6532 - classification_loss: 0.9385 154/500 [========>.....................] - ETA: 2:29 - loss: 3.5882 - regression_loss: 2.6519 - classification_loss: 0.9363 155/500 [========>.....................] 
- ETA: 2:29 - loss: 3.5842 - regression_loss: 2.6506 - classification_loss: 0.9336 156/500 [========>.....................] - ETA: 2:28 - loss: 3.5878 - regression_loss: 2.6532 - classification_loss: 0.9346 157/500 [========>.....................] - ETA: 2:27 - loss: 3.5818 - regression_loss: 2.6511 - classification_loss: 0.9307 158/500 [========>.....................] - ETA: 2:27 - loss: 3.5770 - regression_loss: 2.6491 - classification_loss: 0.9279 159/500 [========>.....................] - ETA: 2:27 - loss: 3.5723 - regression_loss: 2.6474 - classification_loss: 0.9249 160/500 [========>.....................] - ETA: 2:26 - loss: 3.5690 - regression_loss: 2.6461 - classification_loss: 0.9228 161/500 [========>.....................] - ETA: 2:25 - loss: 3.5678 - regression_loss: 2.6463 - classification_loss: 0.9215 162/500 [========>.....................] - ETA: 2:25 - loss: 3.5626 - regression_loss: 2.6441 - classification_loss: 0.9185 163/500 [========>.....................] - ETA: 2:24 - loss: 3.5579 - regression_loss: 2.6421 - classification_loss: 0.9158 164/500 [========>.....................] - ETA: 2:24 - loss: 3.5540 - regression_loss: 2.6412 - classification_loss: 0.9129 165/500 [========>.....................] - ETA: 2:23 - loss: 3.5489 - regression_loss: 2.6384 - classification_loss: 0.9106 166/500 [========>.....................] - ETA: 2:23 - loss: 3.5470 - regression_loss: 2.6383 - classification_loss: 0.9087 167/500 [=========>....................] - ETA: 2:22 - loss: 3.5417 - regression_loss: 2.6361 - classification_loss: 0.9056 168/500 [=========>....................] - ETA: 2:22 - loss: 3.5405 - regression_loss: 2.6357 - classification_loss: 0.9048 169/500 [=========>....................] - ETA: 2:21 - loss: 3.5382 - regression_loss: 2.6351 - classification_loss: 0.9031 170/500 [=========>....................] - ETA: 2:21 - loss: 3.5332 - regression_loss: 2.6333 - classification_loss: 0.8999 171/500 [=========>....................] 
- ETA: 2:20 - loss: 3.5285 - regression_loss: 2.6318 - classification_loss: 0.8968 172/500 [=========>....................] - ETA: 2:20 - loss: 3.5235 - regression_loss: 2.6298 - classification_loss: 0.8938 173/500 [=========>....................] - ETA: 2:19 - loss: 3.5199 - regression_loss: 2.6285 - classification_loss: 0.8914 174/500 [=========>....................] - ETA: 2:19 - loss: 3.5178 - regression_loss: 2.6289 - classification_loss: 0.8889 175/500 [=========>....................] - ETA: 2:18 - loss: 3.5130 - regression_loss: 2.6268 - classification_loss: 0.8862 176/500 [=========>....................] - ETA: 2:18 - loss: 3.5072 - regression_loss: 2.6237 - classification_loss: 0.8834 177/500 [=========>....................] - ETA: 2:17 - loss: 3.5021 - regression_loss: 2.6215 - classification_loss: 0.8806 178/500 [=========>....................] - ETA: 2:17 - loss: 3.4978 - regression_loss: 2.6197 - classification_loss: 0.8781 179/500 [=========>....................] - ETA: 2:16 - loss: 3.4935 - regression_loss: 2.6177 - classification_loss: 0.8758 180/500 [=========>....................] - ETA: 2:15 - loss: 3.4911 - regression_loss: 2.6176 - classification_loss: 0.8734 181/500 [=========>....................] - ETA: 2:15 - loss: 3.4872 - regression_loss: 2.6159 - classification_loss: 0.8712 182/500 [=========>....................] - ETA: 2:14 - loss: 3.4849 - regression_loss: 2.6158 - classification_loss: 0.8691 183/500 [=========>....................] - ETA: 2:14 - loss: 3.4812 - regression_loss: 2.6140 - classification_loss: 0.8672 184/500 [==========>...................] - ETA: 2:13 - loss: 3.4796 - regression_loss: 2.6139 - classification_loss: 0.8657 185/500 [==========>...................] - ETA: 2:13 - loss: 3.4793 - regression_loss: 2.6141 - classification_loss: 0.8652 186/500 [==========>...................] - ETA: 2:13 - loss: 3.4751 - regression_loss: 2.6124 - classification_loss: 0.8627 187/500 [==========>...................] 
- ETA: 2:12 - loss: 3.4748 - regression_loss: 2.6121 - classification_loss: 0.8627 188/500 [==========>...................] - ETA: 2:12 - loss: 3.4690 - regression_loss: 2.6091 - classification_loss: 0.8599 189/500 [==========>...................] - ETA: 2:11 - loss: 3.4645 - regression_loss: 2.6068 - classification_loss: 0.8578 190/500 [==========>...................] - ETA: 2:11 - loss: 3.4604 - regression_loss: 2.6049 - classification_loss: 0.8555 191/500 [==========>...................] - ETA: 2:10 - loss: 3.4540 - regression_loss: 2.6014 - classification_loss: 0.8526 192/500 [==========>...................] - ETA: 2:10 - loss: 3.4502 - regression_loss: 2.6002 - classification_loss: 0.8500 193/500 [==========>...................] - ETA: 2:09 - loss: 3.4484 - regression_loss: 2.6005 - classification_loss: 0.8479 194/500 [==========>...................] - ETA: 2:09 - loss: 3.4465 - regression_loss: 2.6004 - classification_loss: 0.8461 195/500 [==========>...................] - ETA: 2:08 - loss: 3.4446 - regression_loss: 2.5987 - classification_loss: 0.8459 196/500 [==========>...................] - ETA: 2:08 - loss: 3.4402 - regression_loss: 2.5969 - classification_loss: 0.8433 197/500 [==========>...................] - ETA: 2:07 - loss: 3.4378 - regression_loss: 2.5966 - classification_loss: 0.8411 198/500 [==========>...................] - ETA: 2:07 - loss: 3.4330 - regression_loss: 2.5944 - classification_loss: 0.8386 199/500 [==========>...................] - ETA: 2:06 - loss: 3.4288 - regression_loss: 2.5926 - classification_loss: 0.8363 200/500 [===========>..................] - ETA: 2:06 - loss: 3.4247 - regression_loss: 2.5907 - classification_loss: 0.8340 201/500 [===========>..................] - ETA: 2:05 - loss: 3.4229 - regression_loss: 2.5898 - classification_loss: 0.8331 202/500 [===========>..................] - ETA: 2:05 - loss: 3.4203 - regression_loss: 2.5885 - classification_loss: 0.8318 203/500 [===========>..................] 
- ETA: 2:04 - loss: 3.4164 - regression_loss: 2.5866 - classification_loss: 0.8299 204/500 [===========>..................] - ETA: 2:04 - loss: 3.4142 - regression_loss: 2.5852 - classification_loss: 0.8290 205/500 [===========>..................] - ETA: 2:04 - loss: 3.4134 - regression_loss: 2.5856 - classification_loss: 0.8278 206/500 [===========>..................] - ETA: 2:03 - loss: 3.4083 - regression_loss: 2.5827 - classification_loss: 0.8256 207/500 [===========>..................] - ETA: 2:03 - loss: 3.4051 - regression_loss: 2.5814 - classification_loss: 0.8236 208/500 [===========>..................] - ETA: 2:02 - loss: 3.4036 - regression_loss: 2.5818 - classification_loss: 0.8218 209/500 [===========>..................] - ETA: 2:02 - loss: 3.4005 - regression_loss: 2.5808 - classification_loss: 0.8197 210/500 [===========>..................] - ETA: 2:01 - loss: 3.3970 - regression_loss: 2.5781 - classification_loss: 0.8189 211/500 [===========>..................] - ETA: 2:01 - loss: 3.3972 - regression_loss: 2.5786 - classification_loss: 0.8186 212/500 [===========>..................] - ETA: 2:00 - loss: 3.3958 - regression_loss: 2.5783 - classification_loss: 0.8175 213/500 [===========>..................] - ETA: 2:00 - loss: 3.3933 - regression_loss: 2.5766 - classification_loss: 0.8168 214/500 [===========>..................] - ETA: 1:59 - loss: 3.3904 - regression_loss: 2.5756 - classification_loss: 0.8148 215/500 [===========>..................] - ETA: 1:59 - loss: 3.3879 - regression_loss: 2.5750 - classification_loss: 0.8130 216/500 [===========>..................] - ETA: 1:58 - loss: 3.3825 - regression_loss: 2.5713 - classification_loss: 0.8112 217/500 [============>.................] - ETA: 1:58 - loss: 3.3806 - regression_loss: 2.5709 - classification_loss: 0.8096 218/500 [============>.................] - ETA: 1:57 - loss: 3.3828 - regression_loss: 2.5742 - classification_loss: 0.8086 219/500 [============>.................] 
- ETA: 1:57 - loss: 3.3801 - regression_loss: 2.5732 - classification_loss: 0.8069 220/500 [============>.................] - ETA: 1:57 - loss: 3.3786 - regression_loss: 2.5736 - classification_loss: 0.8050 221/500 [============>.................] - ETA: 1:56 - loss: 3.3759 - regression_loss: 2.5723 - classification_loss: 0.8036 222/500 [============>.................] - ETA: 1:56 - loss: 3.3747 - regression_loss: 2.5723 - classification_loss: 0.8024 223/500 [============>.................] - ETA: 1:55 - loss: 3.3736 - regression_loss: 2.5729 - classification_loss: 0.8007 224/500 [============>.................] - ETA: 1:55 - loss: 3.3703 - regression_loss: 2.5716 - classification_loss: 0.7987 225/500 [============>.................] - ETA: 1:54 - loss: 3.3682 - regression_loss: 2.5710 - classification_loss: 0.7972 226/500 [============>.................] - ETA: 1:54 - loss: 3.3647 - regression_loss: 2.5694 - classification_loss: 0.7953 227/500 [============>.................] - ETA: 1:53 - loss: 3.3618 - regression_loss: 2.5679 - classification_loss: 0.7939 228/500 [============>.................] - ETA: 1:53 - loss: 3.3604 - regression_loss: 2.5677 - classification_loss: 0.7928 229/500 [============>.................] - ETA: 1:52 - loss: 3.3577 - regression_loss: 2.5667 - classification_loss: 0.7911 230/500 [============>.................] - ETA: 1:52 - loss: 3.3550 - regression_loss: 2.5656 - classification_loss: 0.7895 231/500 [============>.................] - ETA: 1:52 - loss: 3.3530 - regression_loss: 2.5644 - classification_loss: 0.7886 232/500 [============>.................] - ETA: 1:51 - loss: 3.3486 - regression_loss: 2.5619 - classification_loss: 0.7867 233/500 [============>.................] - ETA: 1:51 - loss: 3.3455 - regression_loss: 2.5605 - classification_loss: 0.7850 234/500 [=============>................] - ETA: 1:50 - loss: 3.3428 - regression_loss: 2.5596 - classification_loss: 0.7833 235/500 [=============>................] 
- ETA: 1:50 - loss: 3.3395 - regression_loss: 2.5584 - classification_loss: 0.7811 236/500 [=============>................] - ETA: 1:49 - loss: 3.3371 - regression_loss: 2.5574 - classification_loss: 0.7797 237/500 [=============>................] - ETA: 1:49 - loss: 3.3331 - regression_loss: 2.5555 - classification_loss: 0.7776 238/500 [=============>................] - ETA: 1:48 - loss: 3.3307 - regression_loss: 2.5545 - classification_loss: 0.7763 239/500 [=============>................] - ETA: 1:48 - loss: 3.3289 - regression_loss: 2.5538 - classification_loss: 0.7751 240/500 [=============>................] - ETA: 1:48 - loss: 3.3263 - regression_loss: 2.5527 - classification_loss: 0.7736 241/500 [=============>................] - ETA: 1:47 - loss: 3.3247 - regression_loss: 2.5528 - classification_loss: 0.7719 242/500 [=============>................] - ETA: 1:47 - loss: 3.3224 - regression_loss: 2.5521 - classification_loss: 0.7702 243/500 [=============>................] - ETA: 1:46 - loss: 3.3216 - regression_loss: 2.5518 - classification_loss: 0.7698 244/500 [=============>................] - ETA: 1:46 - loss: 3.3193 - regression_loss: 2.5511 - classification_loss: 0.7682 245/500 [=============>................] - ETA: 1:45 - loss: 3.3174 - regression_loss: 2.5505 - classification_loss: 0.7669 246/500 [=============>................] - ETA: 1:45 - loss: 3.3139 - regression_loss: 2.5489 - classification_loss: 0.7650 247/500 [=============>................] - ETA: 1:44 - loss: 3.3105 - regression_loss: 2.5472 - classification_loss: 0.7633 248/500 [=============>................] - ETA: 1:44 - loss: 3.3109 - regression_loss: 2.5472 - classification_loss: 0.7637 249/500 [=============>................] - ETA: 1:44 - loss: 3.3072 - regression_loss: 2.5456 - classification_loss: 0.7616 250/500 [==============>...............] - ETA: 1:43 - loss: 3.3057 - regression_loss: 2.5448 - classification_loss: 0.7609 251/500 [==============>...............] 
- ETA: 1:43 - loss: 3.3030 - regression_loss: 2.5436 - classification_loss: 0.7594 252/500 [==============>...............] - ETA: 1:42 - loss: 3.2993 - regression_loss: 2.5414 - classification_loss: 0.7579 253/500 [==============>...............] - ETA: 1:42 - loss: 3.2968 - regression_loss: 2.5401 - classification_loss: 0.7567 254/500 [==============>...............] - ETA: 1:41 - loss: 3.2957 - regression_loss: 2.5402 - classification_loss: 0.7555 255/500 [==============>...............] - ETA: 1:41 - loss: 3.2927 - regression_loss: 2.5386 - classification_loss: 0.7542 256/500 [==============>...............] - ETA: 1:41 - loss: 3.2907 - regression_loss: 2.5378 - classification_loss: 0.7529 257/500 [==============>...............] - ETA: 1:40 - loss: 3.2876 - regression_loss: 2.5365 - classification_loss: 0.7512 258/500 [==============>...............] - ETA: 1:40 - loss: 3.2881 - regression_loss: 2.5374 - classification_loss: 0.7507 259/500 [==============>...............] - ETA: 1:39 - loss: 3.2897 - regression_loss: 2.5396 - classification_loss: 0.7501 260/500 [==============>...............] - ETA: 1:39 - loss: 3.2876 - regression_loss: 2.5391 - classification_loss: 0.7485 261/500 [==============>...............] - ETA: 1:38 - loss: 3.2856 - regression_loss: 2.5381 - classification_loss: 0.7475 262/500 [==============>...............] - ETA: 1:38 - loss: 3.2822 - regression_loss: 2.5364 - classification_loss: 0.7458 263/500 [==============>...............] - ETA: 1:37 - loss: 3.2801 - regression_loss: 2.5355 - classification_loss: 0.7446 264/500 [==============>...............] - ETA: 1:37 - loss: 3.2780 - regression_loss: 2.5346 - classification_loss: 0.7434 265/500 [==============>...............] - ETA: 1:37 - loss: 3.2747 - regression_loss: 2.5329 - classification_loss: 0.7418 266/500 [==============>...............] - ETA: 1:36 - loss: 3.2739 - regression_loss: 2.5330 - classification_loss: 0.7409 267/500 [===============>..............] 
- ETA: 1:36 - loss: 3.2724 - regression_loss: 2.5326 - classification_loss: 0.7397 268/500 [===============>..............] - ETA: 1:35 - loss: 3.2691 - regression_loss: 2.5310 - classification_loss: 0.7381 269/500 [===============>..............] - ETA: 1:35 - loss: 3.2669 - regression_loss: 2.5300 - classification_loss: 0.7369 270/500 [===============>..............] - ETA: 1:34 - loss: 3.2644 - regression_loss: 2.5292 - classification_loss: 0.7352 271/500 [===============>..............] - ETA: 1:34 - loss: 3.2635 - regression_loss: 2.5286 - classification_loss: 0.7349 272/500 [===============>..............] - ETA: 1:33 - loss: 3.2612 - regression_loss: 2.5278 - classification_loss: 0.7334 273/500 [===============>..............] - ETA: 1:33 - loss: 3.2587 - regression_loss: 2.5268 - classification_loss: 0.7319 274/500 [===============>..............] - ETA: 1:33 - loss: 3.2580 - regression_loss: 2.5264 - classification_loss: 0.7316 275/500 [===============>..............] - ETA: 1:32 - loss: 3.2551 - regression_loss: 2.5250 - classification_loss: 0.7301 276/500 [===============>..............] - ETA: 1:32 - loss: 3.2525 - regression_loss: 2.5236 - classification_loss: 0.7289 277/500 [===============>..............] - ETA: 1:31 - loss: 3.2502 - regression_loss: 2.5225 - classification_loss: 0.7277 278/500 [===============>..............] - ETA: 1:31 - loss: 3.2460 - regression_loss: 2.5202 - classification_loss: 0.7259 279/500 [===============>..............] - ETA: 1:30 - loss: 3.2451 - regression_loss: 2.5199 - classification_loss: 0.7252 280/500 [===============>..............] - ETA: 1:30 - loss: 3.2457 - regression_loss: 2.5210 - classification_loss: 0.7247 281/500 [===============>..............] - ETA: 1:30 - loss: 3.2466 - regression_loss: 2.5225 - classification_loss: 0.7241 282/500 [===============>..............] - ETA: 1:29 - loss: 3.2475 - regression_loss: 2.5228 - classification_loss: 0.7247 283/500 [===============>..............] 
- ETA: 1:29 - loss: 3.2481 - regression_loss: 2.5240 - classification_loss: 0.7242 284/500 [================>.............] - ETA: 1:28 - loss: 3.2467 - regression_loss: 2.5234 - classification_loss: 0.7233 285/500 [================>.............] - ETA: 1:28 - loss: 3.2442 - regression_loss: 2.5220 - classification_loss: 0.7223 286/500 [================>.............] - ETA: 1:27 - loss: 3.2429 - regression_loss: 2.5215 - classification_loss: 0.7213 287/500 [================>.............] - ETA: 1:27 - loss: 3.2418 - regression_loss: 2.5218 - classification_loss: 0.7200 288/500 [================>.............] - ETA: 1:27 - loss: 3.2399 - regression_loss: 2.5208 - classification_loss: 0.7190 289/500 [================>.............] - ETA: 1:26 - loss: 3.2375 - regression_loss: 2.5198 - classification_loss: 0.7177 290/500 [================>.............] - ETA: 1:26 - loss: 3.2381 - regression_loss: 2.5208 - classification_loss: 0.7172 291/500 [================>.............] - ETA: 1:25 - loss: 3.2368 - regression_loss: 2.5205 - classification_loss: 0.7163 292/500 [================>.............] - ETA: 1:25 - loss: 3.2353 - regression_loss: 2.5200 - classification_loss: 0.7152 293/500 [================>.............] - ETA: 1:24 - loss: 3.2350 - regression_loss: 2.5201 - classification_loss: 0.7149 294/500 [================>.............] - ETA: 1:24 - loss: 3.2332 - regression_loss: 2.5194 - classification_loss: 0.7138 295/500 [================>.............] - ETA: 1:24 - loss: 3.2306 - regression_loss: 2.5180 - classification_loss: 0.7126 296/500 [================>.............] - ETA: 1:23 - loss: 3.2289 - regression_loss: 2.5176 - classification_loss: 0.7113 297/500 [================>.............] - ETA: 1:23 - loss: 3.2268 - regression_loss: 2.5167 - classification_loss: 0.7102 298/500 [================>.............] - ETA: 1:22 - loss: 3.2259 - regression_loss: 2.5159 - classification_loss: 0.7099 299/500 [================>.............] 
- ETA: 1:22 - loss: 3.2231 - regression_loss: 2.5146 - classification_loss: 0.7086 300/500 [=================>............] - ETA: 1:21 - loss: 3.2205 - regression_loss: 2.5131 - classification_loss: 0.7074 301/500 [=================>............] - ETA: 1:21 - loss: 3.2181 - regression_loss: 2.5116 - classification_loss: 0.7065 302/500 [=================>............] - ETA: 1:21 - loss: 3.2152 - regression_loss: 2.5100 - classification_loss: 0.7052 303/500 [=================>............] - ETA: 1:20 - loss: 3.2129 - regression_loss: 2.5089 - classification_loss: 0.7039 304/500 [=================>............] - ETA: 1:20 - loss: 3.2113 - regression_loss: 2.5086 - classification_loss: 0.7027 305/500 [=================>............] - ETA: 1:19 - loss: 3.2090 - regression_loss: 2.5074 - classification_loss: 0.7015 306/500 [=================>............] - ETA: 1:19 - loss: 3.2065 - regression_loss: 2.5062 - classification_loss: 0.7003 307/500 [=================>............] - ETA: 1:18 - loss: 3.2042 - regression_loss: 2.5049 - classification_loss: 0.6993 308/500 [=================>............] - ETA: 1:18 - loss: 3.2030 - regression_loss: 2.5046 - classification_loss: 0.6984 309/500 [=================>............] - ETA: 1:18 - loss: 3.2015 - regression_loss: 2.5042 - classification_loss: 0.6972 310/500 [=================>............] - ETA: 1:17 - loss: 3.1983 - regression_loss: 2.5025 - classification_loss: 0.6958 311/500 [=================>............] - ETA: 1:17 - loss: 3.1961 - regression_loss: 2.5016 - classification_loss: 0.6945 312/500 [=================>............] - ETA: 1:16 - loss: 3.1943 - regression_loss: 2.5004 - classification_loss: 0.6939 313/500 [=================>............] - ETA: 1:16 - loss: 3.1916 - regression_loss: 2.4991 - classification_loss: 0.6924 314/500 [=================>............] - ETA: 1:16 - loss: 3.1903 - regression_loss: 2.4986 - classification_loss: 0.6917 315/500 [=================>............] 
- ETA: 1:15 - loss: 3.1874 - regression_loss: 2.4968 - classification_loss: 0.6906 316/500 [=================>............] - ETA: 1:15 - loss: 3.1886 - regression_loss: 2.4984 - classification_loss: 0.6902 317/500 [==================>...........] - ETA: 1:14 - loss: 3.1863 - regression_loss: 2.4970 - classification_loss: 0.6893 318/500 [==================>...........] - ETA: 1:14 - loss: 3.1855 - regression_loss: 2.4964 - classification_loss: 0.6891 319/500 [==================>...........] - ETA: 1:13 - loss: 3.1833 - regression_loss: 2.4952 - classification_loss: 0.6880 320/500 [==================>...........] - ETA: 1:13 - loss: 3.1829 - regression_loss: 2.4955 - classification_loss: 0.6874 321/500 [==================>...........] - ETA: 1:13 - loss: 3.1789 - regression_loss: 2.4929 - classification_loss: 0.6859 322/500 [==================>...........] - ETA: 1:12 - loss: 3.1781 - regression_loss: 2.4930 - classification_loss: 0.6851 323/500 [==================>...........] - ETA: 1:12 - loss: 3.1778 - regression_loss: 2.4935 - classification_loss: 0.6843 324/500 [==================>...........] - ETA: 1:11 - loss: 3.1782 - regression_loss: 2.4943 - classification_loss: 0.6839 325/500 [==================>...........] - ETA: 1:11 - loss: 3.1761 - regression_loss: 2.4935 - classification_loss: 0.6826 326/500 [==================>...........] - ETA: 1:11 - loss: 3.1750 - regression_loss: 2.4928 - classification_loss: 0.6822 327/500 [==================>...........] - ETA: 1:10 - loss: 3.1752 - regression_loss: 2.4932 - classification_loss: 0.6820 328/500 [==================>...........] - ETA: 1:10 - loss: 3.1744 - regression_loss: 2.4927 - classification_loss: 0.6817 329/500 [==================>...........] - ETA: 1:09 - loss: 3.1719 - regression_loss: 2.4912 - classification_loss: 0.6807 330/500 [==================>...........] - ETA: 1:09 - loss: 3.1710 - regression_loss: 2.4913 - classification_loss: 0.6797 331/500 [==================>...........] 
- ETA: 1:08 - loss: 3.1700 - regression_loss: 2.4913 - classification_loss: 0.6787 332/500 [==================>...........] - ETA: 1:08 - loss: 3.1677 - regression_loss: 2.4899 - classification_loss: 0.6778 333/500 [==================>...........] - ETA: 1:08 - loss: 3.1680 - regression_loss: 2.4908 - classification_loss: 0.6772 334/500 [===================>..........] - ETA: 1:07 - loss: 3.1672 - regression_loss: 2.4909 - classification_loss: 0.6763 335/500 [===================>..........] - ETA: 1:07 - loss: 3.1657 - regression_loss: 2.4901 - classification_loss: 0.6756 336/500 [===================>..........] - ETA: 1:06 - loss: 3.1634 - regression_loss: 2.4889 - classification_loss: 0.6745 337/500 [===================>..........] - ETA: 1:06 - loss: 3.1611 - regression_loss: 2.4877 - classification_loss: 0.6734 338/500 [===================>..........] - ETA: 1:05 - loss: 3.1610 - regression_loss: 2.4882 - classification_loss: 0.6728 339/500 [===================>..........] - ETA: 1:05 - loss: 3.1591 - regression_loss: 2.4871 - classification_loss: 0.6720 340/500 [===================>..........] - ETA: 1:05 - loss: 3.1577 - regression_loss: 2.4866 - classification_loss: 0.6711 341/500 [===================>..........] - ETA: 1:04 - loss: 3.1560 - regression_loss: 2.4859 - classification_loss: 0.6701 342/500 [===================>..........] - ETA: 1:04 - loss: 3.1549 - regression_loss: 2.4855 - classification_loss: 0.6694 343/500 [===================>..........] - ETA: 1:03 - loss: 3.1569 - regression_loss: 2.4877 - classification_loss: 0.6692 344/500 [===================>..........] - ETA: 1:03 - loss: 3.1550 - regression_loss: 2.4868 - classification_loss: 0.6683 345/500 [===================>..........] - ETA: 1:03 - loss: 3.1512 - regression_loss: 2.4841 - classification_loss: 0.6671 346/500 [===================>..........] - ETA: 1:02 - loss: 3.1501 - regression_loss: 2.4835 - classification_loss: 0.6666 347/500 [===================>..........] 
348/500 [===================>..........] - ETA: 1:01 - loss: 3.1486 - regression_loss: 2.4832 - classification_loss: 0.6654
[... per-batch progress refreshes for batches 349-499 trimmed; loss declined steadily from 3.15 to 2.95 ...]
500/500 [==============================] - 200s 400ms/step - loss: 2.9490 - regression_loss: 2.3672 - classification_loss: 0.5818
326 instances of class plum with average precision: 0.5741
mAP: 0.5741
Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5
Epoch 2/20
1/500 [..............................] - ETA: 3:06 - loss: 2.3141 - regression_loss: 2.0083 - classification_loss: 0.3058
[... per-batch progress refreshes for batches 2-181 trimmed; loss settled around 2.28-2.56 ...]
182/500 [=========>....................] - ETA: 2:02 - loss: 2.2903 - regression_loss: 1.9348 - classification_loss: 0.3556
- ETA: 2:01 - loss: 2.2872 - regression_loss: 1.9323 - classification_loss: 0.3549 183/500 [=========>....................] - ETA: 2:01 - loss: 2.2878 - regression_loss: 1.9330 - classification_loss: 0.3547 184/500 [==========>...................] - ETA: 2:00 - loss: 2.2936 - regression_loss: 1.9375 - classification_loss: 0.3562 185/500 [==========>...................] - ETA: 2:00 - loss: 2.2939 - regression_loss: 1.9380 - classification_loss: 0.3559 186/500 [==========>...................] - ETA: 2:00 - loss: 2.2928 - regression_loss: 1.9374 - classification_loss: 0.3554 187/500 [==========>...................] - ETA: 1:59 - loss: 2.2924 - regression_loss: 1.9371 - classification_loss: 0.3553 188/500 [==========>...................] - ETA: 1:59 - loss: 2.2910 - regression_loss: 1.9362 - classification_loss: 0.3548 189/500 [==========>...................] - ETA: 1:59 - loss: 2.2963 - regression_loss: 1.9403 - classification_loss: 0.3560 190/500 [==========>...................] - ETA: 1:58 - loss: 2.2944 - regression_loss: 1.9391 - classification_loss: 0.3553 191/500 [==========>...................] - ETA: 1:58 - loss: 2.2931 - regression_loss: 1.9382 - classification_loss: 0.3549 192/500 [==========>...................] - ETA: 1:57 - loss: 2.2972 - regression_loss: 1.9415 - classification_loss: 0.3557 193/500 [==========>...................] - ETA: 1:57 - loss: 2.2958 - regression_loss: 1.9404 - classification_loss: 0.3553 194/500 [==========>...................] - ETA: 1:57 - loss: 2.2956 - regression_loss: 1.9406 - classification_loss: 0.3551 195/500 [==========>...................] - ETA: 1:56 - loss: 2.2964 - regression_loss: 1.9407 - classification_loss: 0.3557 196/500 [==========>...................] - ETA: 1:56 - loss: 2.2945 - regression_loss: 1.9389 - classification_loss: 0.3555 197/500 [==========>...................] - ETA: 1:56 - loss: 2.2937 - regression_loss: 1.9383 - classification_loss: 0.3554 198/500 [==========>...................] 
- ETA: 1:55 - loss: 2.2898 - regression_loss: 1.9349 - classification_loss: 0.3549 199/500 [==========>...................] - ETA: 1:55 - loss: 2.2969 - regression_loss: 1.9409 - classification_loss: 0.3559 200/500 [===========>..................] - ETA: 1:54 - loss: 2.2956 - regression_loss: 1.9402 - classification_loss: 0.3555 201/500 [===========>..................] - ETA: 1:54 - loss: 2.2955 - regression_loss: 1.9402 - classification_loss: 0.3552 202/500 [===========>..................] - ETA: 1:54 - loss: 2.2951 - regression_loss: 1.9401 - classification_loss: 0.3550 203/500 [===========>..................] - ETA: 1:53 - loss: 2.2944 - regression_loss: 1.9400 - classification_loss: 0.3544 204/500 [===========>..................] - ETA: 1:53 - loss: 2.2942 - regression_loss: 1.9398 - classification_loss: 0.3544 205/500 [===========>..................] - ETA: 1:52 - loss: 2.2928 - regression_loss: 1.9389 - classification_loss: 0.3540 206/500 [===========>..................] - ETA: 1:52 - loss: 2.2904 - regression_loss: 1.9372 - classification_loss: 0.3533 207/500 [===========>..................] - ETA: 1:52 - loss: 2.2887 - regression_loss: 1.9360 - classification_loss: 0.3527 208/500 [===========>..................] - ETA: 1:51 - loss: 2.2877 - regression_loss: 1.9353 - classification_loss: 0.3524 209/500 [===========>..................] - ETA: 1:51 - loss: 2.2857 - regression_loss: 1.9339 - classification_loss: 0.3518 210/500 [===========>..................] - ETA: 1:51 - loss: 2.2862 - regression_loss: 1.9344 - classification_loss: 0.3518 211/500 [===========>..................] - ETA: 1:50 - loss: 2.2850 - regression_loss: 1.9337 - classification_loss: 0.3513 212/500 [===========>..................] - ETA: 1:50 - loss: 2.2853 - regression_loss: 1.9335 - classification_loss: 0.3518 213/500 [===========>..................] - ETA: 1:49 - loss: 2.2884 - regression_loss: 1.9363 - classification_loss: 0.3521 214/500 [===========>..................] 
- ETA: 1:49 - loss: 2.2886 - regression_loss: 1.9366 - classification_loss: 0.3520 215/500 [===========>..................] - ETA: 1:49 - loss: 2.2870 - regression_loss: 1.9354 - classification_loss: 0.3515 216/500 [===========>..................] - ETA: 1:48 - loss: 2.2859 - regression_loss: 1.9347 - classification_loss: 0.3512 217/500 [============>.................] - ETA: 1:48 - loss: 2.2851 - regression_loss: 1.9341 - classification_loss: 0.3509 218/500 [============>.................] - ETA: 1:48 - loss: 2.2838 - regression_loss: 1.9331 - classification_loss: 0.3507 219/500 [============>.................] - ETA: 1:47 - loss: 2.2814 - regression_loss: 1.9314 - classification_loss: 0.3500 220/500 [============>.................] - ETA: 1:47 - loss: 2.2809 - regression_loss: 1.9314 - classification_loss: 0.3496 221/500 [============>.................] - ETA: 1:46 - loss: 2.2797 - regression_loss: 1.9303 - classification_loss: 0.3494 222/500 [============>.................] - ETA: 1:46 - loss: 2.2791 - regression_loss: 1.9301 - classification_loss: 0.3491 223/500 [============>.................] - ETA: 1:46 - loss: 2.2792 - regression_loss: 1.9301 - classification_loss: 0.3491 224/500 [============>.................] - ETA: 1:45 - loss: 2.2764 - regression_loss: 1.9280 - classification_loss: 0.3484 225/500 [============>.................] - ETA: 1:45 - loss: 2.2768 - regression_loss: 1.9283 - classification_loss: 0.3484 226/500 [============>.................] - ETA: 1:45 - loss: 2.2770 - regression_loss: 1.9288 - classification_loss: 0.3481 227/500 [============>.................] - ETA: 1:44 - loss: 2.2794 - regression_loss: 1.9310 - classification_loss: 0.3484 228/500 [============>.................] - ETA: 1:44 - loss: 2.2765 - regression_loss: 1.9288 - classification_loss: 0.3477 229/500 [============>.................] - ETA: 1:43 - loss: 2.2739 - regression_loss: 1.9268 - classification_loss: 0.3471 230/500 [============>.................] 
- ETA: 1:43 - loss: 2.2755 - regression_loss: 1.9281 - classification_loss: 0.3473 231/500 [============>.................] - ETA: 1:43 - loss: 2.2709 - regression_loss: 1.9244 - classification_loss: 0.3464 232/500 [============>.................] - ETA: 1:42 - loss: 2.2710 - regression_loss: 1.9247 - classification_loss: 0.3462 233/500 [============>.................] - ETA: 1:42 - loss: 2.2702 - regression_loss: 1.9244 - classification_loss: 0.3457 234/500 [=============>................] - ETA: 1:42 - loss: 2.2702 - regression_loss: 1.9234 - classification_loss: 0.3468 235/500 [=============>................] - ETA: 1:41 - loss: 2.2682 - regression_loss: 1.9219 - classification_loss: 0.3462 236/500 [=============>................] - ETA: 1:41 - loss: 2.2656 - regression_loss: 1.9198 - classification_loss: 0.3458 237/500 [=============>................] - ETA: 1:40 - loss: 2.2646 - regression_loss: 1.9191 - classification_loss: 0.3455 238/500 [=============>................] - ETA: 1:40 - loss: 2.2651 - regression_loss: 1.9195 - classification_loss: 0.3455 239/500 [=============>................] - ETA: 1:40 - loss: 2.2643 - regression_loss: 1.9193 - classification_loss: 0.3450 240/500 [=============>................] - ETA: 1:39 - loss: 2.2660 - regression_loss: 1.9206 - classification_loss: 0.3453 241/500 [=============>................] - ETA: 1:39 - loss: 2.2721 - regression_loss: 1.9246 - classification_loss: 0.3475 242/500 [=============>................] - ETA: 1:39 - loss: 2.2715 - regression_loss: 1.9242 - classification_loss: 0.3474 243/500 [=============>................] - ETA: 1:38 - loss: 2.2698 - regression_loss: 1.9231 - classification_loss: 0.3467 244/500 [=============>................] - ETA: 1:38 - loss: 2.2704 - regression_loss: 1.9231 - classification_loss: 0.3473 245/500 [=============>................] - ETA: 1:37 - loss: 2.2686 - regression_loss: 1.9218 - classification_loss: 0.3469 246/500 [=============>................] 
- ETA: 1:37 - loss: 2.2656 - regression_loss: 1.9194 - classification_loss: 0.3462 247/500 [=============>................] - ETA: 1:37 - loss: 2.2640 - regression_loss: 1.9182 - classification_loss: 0.3459 248/500 [=============>................] - ETA: 1:36 - loss: 2.2584 - regression_loss: 1.9133 - classification_loss: 0.3451 249/500 [=============>................] - ETA: 1:36 - loss: 2.2627 - regression_loss: 1.9162 - classification_loss: 0.3464 250/500 [==============>...............] - ETA: 1:35 - loss: 2.2651 - regression_loss: 1.9183 - classification_loss: 0.3467 251/500 [==============>...............] - ETA: 1:35 - loss: 2.2649 - regression_loss: 1.9184 - classification_loss: 0.3465 252/500 [==============>...............] - ETA: 1:35 - loss: 2.2666 - regression_loss: 1.9191 - classification_loss: 0.3475 253/500 [==============>...............] - ETA: 1:34 - loss: 2.2673 - regression_loss: 1.9194 - classification_loss: 0.3479 254/500 [==============>...............] - ETA: 1:34 - loss: 2.2697 - regression_loss: 1.9211 - classification_loss: 0.3485 255/500 [==============>...............] - ETA: 1:34 - loss: 2.2708 - regression_loss: 1.9216 - classification_loss: 0.3492 256/500 [==============>...............] - ETA: 1:33 - loss: 2.2707 - regression_loss: 1.9216 - classification_loss: 0.3491 257/500 [==============>...............] - ETA: 1:33 - loss: 2.2736 - regression_loss: 1.9239 - classification_loss: 0.3497 258/500 [==============>...............] - ETA: 1:32 - loss: 2.2768 - regression_loss: 1.9261 - classification_loss: 0.3508 259/500 [==============>...............] - ETA: 1:32 - loss: 2.2747 - regression_loss: 1.9244 - classification_loss: 0.3503 260/500 [==============>...............] - ETA: 1:32 - loss: 2.2732 - regression_loss: 1.9232 - classification_loss: 0.3501 261/500 [==============>...............] - ETA: 1:31 - loss: 2.2732 - regression_loss: 1.9229 - classification_loss: 0.3503 262/500 [==============>...............] 
- ETA: 1:31 - loss: 2.2699 - regression_loss: 1.9197 - classification_loss: 0.3501 263/500 [==============>...............] - ETA: 1:31 - loss: 2.2729 - regression_loss: 1.9217 - classification_loss: 0.3512 264/500 [==============>...............] - ETA: 1:30 - loss: 2.2744 - regression_loss: 1.9229 - classification_loss: 0.3515 265/500 [==============>...............] - ETA: 1:30 - loss: 2.2751 - regression_loss: 1.9235 - classification_loss: 0.3516 266/500 [==============>...............] - ETA: 1:29 - loss: 2.2735 - regression_loss: 1.9222 - classification_loss: 0.3514 267/500 [===============>..............] - ETA: 1:29 - loss: 2.2713 - regression_loss: 1.9207 - classification_loss: 0.3507 268/500 [===============>..............] - ETA: 1:29 - loss: 2.2717 - regression_loss: 1.9208 - classification_loss: 0.3510 269/500 [===============>..............] - ETA: 1:28 - loss: 2.2712 - regression_loss: 1.9203 - classification_loss: 0.3509 270/500 [===============>..............] - ETA: 1:28 - loss: 2.2695 - regression_loss: 1.9190 - classification_loss: 0.3506 271/500 [===============>..............] - ETA: 1:27 - loss: 2.2672 - regression_loss: 1.9172 - classification_loss: 0.3499 272/500 [===============>..............] - ETA: 1:27 - loss: 2.2659 - regression_loss: 1.9162 - classification_loss: 0.3497 273/500 [===============>..............] - ETA: 1:27 - loss: 2.2659 - regression_loss: 1.9159 - classification_loss: 0.3500 274/500 [===============>..............] - ETA: 1:26 - loss: 2.2673 - regression_loss: 1.9169 - classification_loss: 0.3504 275/500 [===============>..............] - ETA: 1:26 - loss: 2.2645 - regression_loss: 1.9144 - classification_loss: 0.3501 276/500 [===============>..............] - ETA: 1:26 - loss: 2.2642 - regression_loss: 1.9142 - classification_loss: 0.3500 277/500 [===============>..............] - ETA: 1:25 - loss: 2.2648 - regression_loss: 1.9148 - classification_loss: 0.3501 278/500 [===============>..............] 
- ETA: 1:25 - loss: 2.2659 - regression_loss: 1.9158 - classification_loss: 0.3501 279/500 [===============>..............] - ETA: 1:24 - loss: 2.2644 - regression_loss: 1.9144 - classification_loss: 0.3499 280/500 [===============>..............] - ETA: 1:24 - loss: 2.2636 - regression_loss: 1.9139 - classification_loss: 0.3497 281/500 [===============>..............] - ETA: 1:24 - loss: 2.2623 - regression_loss: 1.9127 - classification_loss: 0.3496 282/500 [===============>..............] - ETA: 1:23 - loss: 2.2607 - regression_loss: 1.9114 - classification_loss: 0.3493 283/500 [===============>..............] - ETA: 1:23 - loss: 2.2569 - regression_loss: 1.9081 - classification_loss: 0.3488 284/500 [================>.............] - ETA: 1:22 - loss: 2.2543 - regression_loss: 1.9060 - classification_loss: 0.3483 285/500 [================>.............] - ETA: 1:22 - loss: 2.2538 - regression_loss: 1.9055 - classification_loss: 0.3483 286/500 [================>.............] - ETA: 1:21 - loss: 2.2538 - regression_loss: 1.9054 - classification_loss: 0.3483 287/500 [================>.............] - ETA: 1:21 - loss: 2.2516 - regression_loss: 1.9038 - classification_loss: 0.3478 288/500 [================>.............] - ETA: 1:21 - loss: 2.2535 - regression_loss: 1.9049 - classification_loss: 0.3486 289/500 [================>.............] - ETA: 1:20 - loss: 2.2538 - regression_loss: 1.9052 - classification_loss: 0.3486 290/500 [================>.............] - ETA: 1:20 - loss: 2.2533 - regression_loss: 1.9047 - classification_loss: 0.3485 291/500 [================>.............] - ETA: 1:20 - loss: 2.2538 - regression_loss: 1.9053 - classification_loss: 0.3485 292/500 [================>.............] - ETA: 1:19 - loss: 2.2537 - regression_loss: 1.9053 - classification_loss: 0.3485 293/500 [================>.............] - ETA: 1:19 - loss: 2.2534 - regression_loss: 1.9050 - classification_loss: 0.3484 294/500 [================>.............] 
- ETA: 1:18 - loss: 2.2492 - regression_loss: 1.9015 - classification_loss: 0.3477 295/500 [================>.............] - ETA: 1:18 - loss: 2.2504 - regression_loss: 1.9025 - classification_loss: 0.3479 296/500 [================>.............] - ETA: 1:18 - loss: 2.2495 - regression_loss: 1.9018 - classification_loss: 0.3477 297/500 [================>.............] - ETA: 1:17 - loss: 2.2486 - regression_loss: 1.9013 - classification_loss: 0.3473 298/500 [================>.............] - ETA: 1:17 - loss: 2.2455 - regression_loss: 1.8987 - classification_loss: 0.3468 299/500 [================>.............] - ETA: 1:17 - loss: 2.2451 - regression_loss: 1.8986 - classification_loss: 0.3466 300/500 [=================>............] - ETA: 1:16 - loss: 2.2415 - regression_loss: 1.8955 - classification_loss: 0.3460 301/500 [=================>............] - ETA: 1:16 - loss: 2.2407 - regression_loss: 1.8951 - classification_loss: 0.3456 302/500 [=================>............] - ETA: 1:15 - loss: 2.2420 - regression_loss: 1.8965 - classification_loss: 0.3455 303/500 [=================>............] - ETA: 1:15 - loss: 2.2404 - regression_loss: 1.8952 - classification_loss: 0.3452 304/500 [=================>............] - ETA: 1:15 - loss: 2.2392 - regression_loss: 1.8944 - classification_loss: 0.3448 305/500 [=================>............] - ETA: 1:14 - loss: 2.2376 - regression_loss: 1.8928 - classification_loss: 0.3448 306/500 [=================>............] - ETA: 1:14 - loss: 2.2333 - regression_loss: 1.8888 - classification_loss: 0.3445 307/500 [=================>............] - ETA: 1:14 - loss: 2.2331 - regression_loss: 1.8885 - classification_loss: 0.3446 308/500 [=================>............] - ETA: 1:13 - loss: 2.2317 - regression_loss: 1.8875 - classification_loss: 0.3442 309/500 [=================>............] - ETA: 1:13 - loss: 2.2307 - regression_loss: 1.8868 - classification_loss: 0.3439 310/500 [=================>............] 
- ETA: 1:12 - loss: 2.2294 - regression_loss: 1.8857 - classification_loss: 0.3437 311/500 [=================>............] - ETA: 1:12 - loss: 2.2277 - regression_loss: 1.8845 - classification_loss: 0.3432 312/500 [=================>............] - ETA: 1:12 - loss: 2.2269 - regression_loss: 1.8838 - classification_loss: 0.3431 313/500 [=================>............] - ETA: 1:11 - loss: 2.2273 - regression_loss: 1.8842 - classification_loss: 0.3431 314/500 [=================>............] - ETA: 1:11 - loss: 2.2250 - regression_loss: 1.8823 - classification_loss: 0.3427 315/500 [=================>............] - ETA: 1:10 - loss: 2.2253 - regression_loss: 1.8827 - classification_loss: 0.3426 316/500 [=================>............] - ETA: 1:10 - loss: 2.2241 - regression_loss: 1.8819 - classification_loss: 0.3421 317/500 [==================>...........] - ETA: 1:10 - loss: 2.2225 - regression_loss: 1.8808 - classification_loss: 0.3417 318/500 [==================>...........] - ETA: 1:09 - loss: 2.2233 - regression_loss: 1.8815 - classification_loss: 0.3418 319/500 [==================>...........] - ETA: 1:09 - loss: 2.2218 - regression_loss: 1.8803 - classification_loss: 0.3414 320/500 [==================>...........] - ETA: 1:09 - loss: 2.2206 - regression_loss: 1.8795 - classification_loss: 0.3412 321/500 [==================>...........] - ETA: 1:08 - loss: 2.2199 - regression_loss: 1.8789 - classification_loss: 0.3410 322/500 [==================>...........] - ETA: 1:08 - loss: 2.2195 - regression_loss: 1.8786 - classification_loss: 0.3408 323/500 [==================>...........] - ETA: 1:07 - loss: 2.2211 - regression_loss: 1.8794 - classification_loss: 0.3417 324/500 [==================>...........] - ETA: 1:07 - loss: 2.2192 - regression_loss: 1.8778 - classification_loss: 0.3414 325/500 [==================>...........] - ETA: 1:07 - loss: 2.2178 - regression_loss: 1.8768 - classification_loss: 0.3410 326/500 [==================>...........] 
- ETA: 1:06 - loss: 2.2178 - regression_loss: 1.8767 - classification_loss: 0.3410 327/500 [==================>...........] - ETA: 1:06 - loss: 2.2179 - regression_loss: 1.8771 - classification_loss: 0.3409 328/500 [==================>...........] - ETA: 1:06 - loss: 2.2149 - regression_loss: 1.8747 - classification_loss: 0.3402 329/500 [==================>...........] - ETA: 1:05 - loss: 2.2151 - regression_loss: 1.8745 - classification_loss: 0.3406 330/500 [==================>...........] - ETA: 1:05 - loss: 2.2149 - regression_loss: 1.8744 - classification_loss: 0.3405 331/500 [==================>...........] - ETA: 1:04 - loss: 2.2127 - regression_loss: 1.8725 - classification_loss: 0.3402 332/500 [==================>...........] - ETA: 1:04 - loss: 2.2111 - regression_loss: 1.8712 - classification_loss: 0.3399 333/500 [==================>...........] - ETA: 1:04 - loss: 2.2110 - regression_loss: 1.8714 - classification_loss: 0.3396 334/500 [===================>..........] - ETA: 1:03 - loss: 2.2110 - regression_loss: 1.8715 - classification_loss: 0.3395 335/500 [===================>..........] - ETA: 1:03 - loss: 2.2105 - regression_loss: 1.8712 - classification_loss: 0.3393 336/500 [===================>..........] - ETA: 1:03 - loss: 2.2098 - regression_loss: 1.8708 - classification_loss: 0.3390 337/500 [===================>..........] - ETA: 1:02 - loss: 2.2097 - regression_loss: 1.8701 - classification_loss: 0.3396 338/500 [===================>..........] - ETA: 1:02 - loss: 2.2086 - regression_loss: 1.8693 - classification_loss: 0.3393 339/500 [===================>..........] - ETA: 1:01 - loss: 2.2082 - regression_loss: 1.8689 - classification_loss: 0.3393 340/500 [===================>..........] - ETA: 1:01 - loss: 2.2127 - regression_loss: 1.8730 - classification_loss: 0.3397 341/500 [===================>..........] - ETA: 1:01 - loss: 2.2113 - regression_loss: 1.8719 - classification_loss: 0.3394 342/500 [===================>..........] 
- ETA: 1:00 - loss: 2.2108 - regression_loss: 1.8715 - classification_loss: 0.3393 343/500 [===================>..........] - ETA: 1:00 - loss: 2.2095 - regression_loss: 1.8703 - classification_loss: 0.3392 344/500 [===================>..........] - ETA: 59s - loss: 2.2086 - regression_loss: 1.8699 - classification_loss: 0.3388  345/500 [===================>..........] - ETA: 59s - loss: 2.2054 - regression_loss: 1.8670 - classification_loss: 0.3384 346/500 [===================>..........] - ETA: 59s - loss: 2.2053 - regression_loss: 1.8668 - classification_loss: 0.3384 347/500 [===================>..........] - ETA: 58s - loss: 2.2034 - regression_loss: 1.8653 - classification_loss: 0.3381 348/500 [===================>..........] - ETA: 58s - loss: 2.2016 - regression_loss: 1.8640 - classification_loss: 0.3376 349/500 [===================>..........] - ETA: 58s - loss: 2.2008 - regression_loss: 1.8632 - classification_loss: 0.3376 350/500 [====================>.........] - ETA: 57s - loss: 2.1987 - regression_loss: 1.8615 - classification_loss: 0.3372 351/500 [====================>.........] - ETA: 57s - loss: 2.1996 - regression_loss: 1.8624 - classification_loss: 0.3372 352/500 [====================>.........] - ETA: 56s - loss: 2.1985 - regression_loss: 1.8611 - classification_loss: 0.3374 353/500 [====================>.........] - ETA: 56s - loss: 2.1946 - regression_loss: 1.8575 - classification_loss: 0.3371 354/500 [====================>.........] - ETA: 56s - loss: 2.1944 - regression_loss: 1.8575 - classification_loss: 0.3369 355/500 [====================>.........] - ETA: 55s - loss: 2.1937 - regression_loss: 1.8569 - classification_loss: 0.3367 356/500 [====================>.........] - ETA: 55s - loss: 2.1929 - regression_loss: 1.8565 - classification_loss: 0.3364 357/500 [====================>.........] - ETA: 55s - loss: 2.1923 - regression_loss: 1.8560 - classification_loss: 0.3363 358/500 [====================>.........] 
- ETA: 54s - loss: 2.1946 - regression_loss: 1.8577 - classification_loss: 0.3369 359/500 [====================>.........] - ETA: 54s - loss: 2.1982 - regression_loss: 1.8608 - classification_loss: 0.3374 360/500 [====================>.........] - ETA: 53s - loss: 2.2009 - regression_loss: 1.8629 - classification_loss: 0.3379 361/500 [====================>.........] - ETA: 53s - loss: 2.2016 - regression_loss: 1.8637 - classification_loss: 0.3379 362/500 [====================>.........] - ETA: 53s - loss: 2.2012 - regression_loss: 1.8634 - classification_loss: 0.3379 363/500 [====================>.........] - ETA: 52s - loss: 2.2018 - regression_loss: 1.8636 - classification_loss: 0.3382 364/500 [====================>.........] - ETA: 52s - loss: 2.2014 - regression_loss: 1.8632 - classification_loss: 0.3383 365/500 [====================>.........] - ETA: 51s - loss: 2.2010 - regression_loss: 1.8628 - classification_loss: 0.3382 366/500 [====================>.........] - ETA: 51s - loss: 2.2015 - regression_loss: 1.8633 - classification_loss: 0.3381 367/500 [=====================>........] - ETA: 51s - loss: 2.2025 - regression_loss: 1.8641 - classification_loss: 0.3384 368/500 [=====================>........] - ETA: 50s - loss: 2.2030 - regression_loss: 1.8644 - classification_loss: 0.3386 369/500 [=====================>........] - ETA: 50s - loss: 2.2021 - regression_loss: 1.8638 - classification_loss: 0.3383 370/500 [=====================>........] - ETA: 50s - loss: 2.2000 - regression_loss: 1.8621 - classification_loss: 0.3379 371/500 [=====================>........] - ETA: 49s - loss: 2.2017 - regression_loss: 1.8636 - classification_loss: 0.3381 372/500 [=====================>........] - ETA: 49s - loss: 2.2005 - regression_loss: 1.8628 - classification_loss: 0.3377 373/500 [=====================>........] - ETA: 48s - loss: 2.2000 - regression_loss: 1.8624 - classification_loss: 0.3376 374/500 [=====================>........] 
- ETA: 48s - loss: 2.2003 - regression_loss: 1.8628 - classification_loss: 0.3375 375/500 [=====================>........] - ETA: 48s - loss: 2.1999 - regression_loss: 1.8626 - classification_loss: 0.3373 376/500 [=====================>........] - ETA: 47s - loss: 2.1993 - regression_loss: 1.8621 - classification_loss: 0.3372 377/500 [=====================>........] - ETA: 47s - loss: 2.1987 - regression_loss: 1.8616 - classification_loss: 0.3371 378/500 [=====================>........] - ETA: 46s - loss: 2.1977 - regression_loss: 1.8609 - classification_loss: 0.3368 379/500 [=====================>........] - ETA: 46s - loss: 2.1970 - regression_loss: 1.8604 - classification_loss: 0.3366 380/500 [=====================>........] - ETA: 46s - loss: 2.1966 - regression_loss: 1.8602 - classification_loss: 0.3364 381/500 [=====================>........] - ETA: 45s - loss: 2.1952 - regression_loss: 1.8590 - classification_loss: 0.3362 382/500 [=====================>........] - ETA: 45s - loss: 2.1943 - regression_loss: 1.8584 - classification_loss: 0.3359 383/500 [=====================>........] - ETA: 45s - loss: 2.1916 - regression_loss: 1.8562 - classification_loss: 0.3354 384/500 [======================>.......] - ETA: 44s - loss: 2.1894 - regression_loss: 1.8542 - classification_loss: 0.3352 385/500 [======================>.......] - ETA: 44s - loss: 2.1880 - regression_loss: 1.8531 - classification_loss: 0.3348 386/500 [======================>.......] - ETA: 43s - loss: 2.1856 - regression_loss: 1.8512 - classification_loss: 0.3344 387/500 [======================>.......] - ETA: 43s - loss: 2.1845 - regression_loss: 1.8504 - classification_loss: 0.3341 388/500 [======================>.......] - ETA: 43s - loss: 2.1854 - regression_loss: 1.8512 - classification_loss: 0.3342 389/500 [======================>.......] - ETA: 42s - loss: 2.1850 - regression_loss: 1.8509 - classification_loss: 0.3342 390/500 [======================>.......] 
[epoch 2, batches 390–499: per-batch progress lines condensed; loss drifted from ~2.18 down to ~2.14 (regression ~1.81, classification ~0.33)]
500/500 [==============================] - 193s 386ms/step - loss: 2.1368 - regression_loss: 1.8078 - classification_loss: 0.3290
326 instances of class plum with average precision: 0.6709
mAP: 0.6709
Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5
Epoch 3/20
[epoch 3, batches 1–225: per-batch progress lines condensed; loss fell from ~2.27 at the first batch to ~1.89 (regression ~1.61, classification ~0.275), ETA ~3:08 down to ~1:47; epoch still in progress at the end of this capture]
- ETA: 1:47 - loss: 1.8891 - regression_loss: 1.6141 - classification_loss: 0.2750 226/500 [============>.................] - ETA: 1:46 - loss: 1.8904 - regression_loss: 1.6146 - classification_loss: 0.2758 227/500 [============>.................] - ETA: 1:46 - loss: 1.8864 - regression_loss: 1.6111 - classification_loss: 0.2753 228/500 [============>.................] - ETA: 1:46 - loss: 1.8883 - regression_loss: 1.6117 - classification_loss: 0.2766 229/500 [============>.................] - ETA: 1:45 - loss: 1.8871 - regression_loss: 1.6107 - classification_loss: 0.2764 230/500 [============>.................] - ETA: 1:45 - loss: 1.8871 - regression_loss: 1.6107 - classification_loss: 0.2764 231/500 [============>.................] - ETA: 1:44 - loss: 1.8876 - regression_loss: 1.6111 - classification_loss: 0.2765 232/500 [============>.................] - ETA: 1:44 - loss: 1.8885 - regression_loss: 1.6117 - classification_loss: 0.2768 233/500 [============>.................] - ETA: 1:44 - loss: 1.8892 - regression_loss: 1.6121 - classification_loss: 0.2771 234/500 [=============>................] - ETA: 1:43 - loss: 1.8914 - regression_loss: 1.6137 - classification_loss: 0.2777 235/500 [=============>................] - ETA: 1:43 - loss: 1.8892 - regression_loss: 1.6120 - classification_loss: 0.2772 236/500 [=============>................] - ETA: 1:42 - loss: 1.8895 - regression_loss: 1.6125 - classification_loss: 0.2770 237/500 [=============>................] - ETA: 1:42 - loss: 1.8921 - regression_loss: 1.6140 - classification_loss: 0.2781 238/500 [=============>................] - ETA: 1:42 - loss: 1.8902 - regression_loss: 1.6126 - classification_loss: 0.2776 239/500 [=============>................] - ETA: 1:41 - loss: 1.8889 - regression_loss: 1.6116 - classification_loss: 0.2774 240/500 [=============>................] - ETA: 1:41 - loss: 1.8868 - regression_loss: 1.6098 - classification_loss: 0.2769 241/500 [=============>................] 
- ETA: 1:40 - loss: 1.8838 - regression_loss: 1.6072 - classification_loss: 0.2766 242/500 [=============>................] - ETA: 1:40 - loss: 1.8840 - regression_loss: 1.6076 - classification_loss: 0.2765 243/500 [=============>................] - ETA: 1:40 - loss: 1.8840 - regression_loss: 1.6076 - classification_loss: 0.2764 244/500 [=============>................] - ETA: 1:39 - loss: 1.8809 - regression_loss: 1.6050 - classification_loss: 0.2759 245/500 [=============>................] - ETA: 1:39 - loss: 1.8797 - regression_loss: 1.6040 - classification_loss: 0.2757 246/500 [=============>................] - ETA: 1:38 - loss: 1.8807 - regression_loss: 1.6051 - classification_loss: 0.2756 247/500 [=============>................] - ETA: 1:38 - loss: 1.8806 - regression_loss: 1.6052 - classification_loss: 0.2754 248/500 [=============>................] - ETA: 1:38 - loss: 1.8775 - regression_loss: 1.6027 - classification_loss: 0.2748 249/500 [=============>................] - ETA: 1:37 - loss: 1.8773 - regression_loss: 1.6024 - classification_loss: 0.2750 250/500 [==============>...............] - ETA: 1:37 - loss: 1.8771 - regression_loss: 1.6021 - classification_loss: 0.2750 251/500 [==============>...............] - ETA: 1:36 - loss: 1.8760 - regression_loss: 1.6011 - classification_loss: 0.2749 252/500 [==============>...............] - ETA: 1:36 - loss: 1.8723 - regression_loss: 1.5978 - classification_loss: 0.2745 253/500 [==============>...............] - ETA: 1:36 - loss: 1.8718 - regression_loss: 1.5974 - classification_loss: 0.2743 254/500 [==============>...............] - ETA: 1:35 - loss: 1.8691 - regression_loss: 1.5949 - classification_loss: 0.2742 255/500 [==============>...............] - ETA: 1:35 - loss: 1.8666 - regression_loss: 1.5928 - classification_loss: 0.2738 256/500 [==============>...............] - ETA: 1:35 - loss: 1.8706 - regression_loss: 1.5960 - classification_loss: 0.2746 257/500 [==============>...............] 
- ETA: 1:34 - loss: 1.8717 - regression_loss: 1.5971 - classification_loss: 0.2745 258/500 [==============>...............] - ETA: 1:34 - loss: 1.8705 - regression_loss: 1.5963 - classification_loss: 0.2742 259/500 [==============>...............] - ETA: 1:33 - loss: 1.8712 - regression_loss: 1.5969 - classification_loss: 0.2743 260/500 [==============>...............] - ETA: 1:33 - loss: 1.8704 - regression_loss: 1.5963 - classification_loss: 0.2741 261/500 [==============>...............] - ETA: 1:33 - loss: 1.8714 - regression_loss: 1.5970 - classification_loss: 0.2744 262/500 [==============>...............] - ETA: 1:32 - loss: 1.8715 - regression_loss: 1.5971 - classification_loss: 0.2744 263/500 [==============>...............] - ETA: 1:32 - loss: 1.8720 - regression_loss: 1.5974 - classification_loss: 0.2745 264/500 [==============>...............] - ETA: 1:31 - loss: 1.8721 - regression_loss: 1.5974 - classification_loss: 0.2747 265/500 [==============>...............] - ETA: 1:31 - loss: 1.8718 - regression_loss: 1.5972 - classification_loss: 0.2746 266/500 [==============>...............] - ETA: 1:31 - loss: 1.8702 - regression_loss: 1.5959 - classification_loss: 0.2743 267/500 [===============>..............] - ETA: 1:30 - loss: 1.8714 - regression_loss: 1.5968 - classification_loss: 0.2747 268/500 [===============>..............] - ETA: 1:30 - loss: 1.8738 - regression_loss: 1.5989 - classification_loss: 0.2749 269/500 [===============>..............] - ETA: 1:29 - loss: 1.8737 - regression_loss: 1.5989 - classification_loss: 0.2748 270/500 [===============>..............] - ETA: 1:29 - loss: 1.8724 - regression_loss: 1.5980 - classification_loss: 0.2744 271/500 [===============>..............] - ETA: 1:29 - loss: 1.8706 - regression_loss: 1.5966 - classification_loss: 0.2740 272/500 [===============>..............] - ETA: 1:28 - loss: 1.8718 - regression_loss: 1.5976 - classification_loss: 0.2742 273/500 [===============>..............] 
- ETA: 1:28 - loss: 1.8711 - regression_loss: 1.5969 - classification_loss: 0.2741 274/500 [===============>..............] - ETA: 1:27 - loss: 1.8720 - regression_loss: 1.5979 - classification_loss: 0.2742 275/500 [===============>..............] - ETA: 1:27 - loss: 1.8722 - regression_loss: 1.5982 - classification_loss: 0.2740 276/500 [===============>..............] - ETA: 1:27 - loss: 1.8734 - regression_loss: 1.5993 - classification_loss: 0.2740 277/500 [===============>..............] - ETA: 1:26 - loss: 1.8718 - regression_loss: 1.5981 - classification_loss: 0.2737 278/500 [===============>..............] - ETA: 1:26 - loss: 1.8734 - regression_loss: 1.5988 - classification_loss: 0.2746 279/500 [===============>..............] - ETA: 1:25 - loss: 1.8702 - regression_loss: 1.5961 - classification_loss: 0.2741 280/500 [===============>..............] - ETA: 1:25 - loss: 1.8708 - regression_loss: 1.5966 - classification_loss: 0.2741 281/500 [===============>..............] - ETA: 1:25 - loss: 1.8705 - regression_loss: 1.5967 - classification_loss: 0.2739 282/500 [===============>..............] - ETA: 1:24 - loss: 1.8685 - regression_loss: 1.5949 - classification_loss: 0.2735 283/500 [===============>..............] - ETA: 1:24 - loss: 1.8673 - regression_loss: 1.5942 - classification_loss: 0.2732 284/500 [================>.............] - ETA: 1:24 - loss: 1.8671 - regression_loss: 1.5941 - classification_loss: 0.2730 285/500 [================>.............] - ETA: 1:23 - loss: 1.8694 - regression_loss: 1.5961 - classification_loss: 0.2733 286/500 [================>.............] - ETA: 1:23 - loss: 1.8690 - regression_loss: 1.5957 - classification_loss: 0.2732 287/500 [================>.............] - ETA: 1:22 - loss: 1.8680 - regression_loss: 1.5950 - classification_loss: 0.2730 288/500 [================>.............] - ETA: 1:22 - loss: 1.8663 - regression_loss: 1.5936 - classification_loss: 0.2726 289/500 [================>.............] 
- ETA: 1:22 - loss: 1.8657 - regression_loss: 1.5935 - classification_loss: 0.2722 290/500 [================>.............] - ETA: 1:21 - loss: 1.8626 - regression_loss: 1.5910 - classification_loss: 0.2716 291/500 [================>.............] - ETA: 1:21 - loss: 1.8627 - regression_loss: 1.5912 - classification_loss: 0.2715 292/500 [================>.............] - ETA: 1:20 - loss: 1.8592 - regression_loss: 1.5882 - classification_loss: 0.2710 293/500 [================>.............] - ETA: 1:20 - loss: 1.8615 - regression_loss: 1.5900 - classification_loss: 0.2715 294/500 [================>.............] - ETA: 1:20 - loss: 1.8605 - regression_loss: 1.5892 - classification_loss: 0.2713 295/500 [================>.............] - ETA: 1:19 - loss: 1.8608 - regression_loss: 1.5895 - classification_loss: 0.2713 296/500 [================>.............] - ETA: 1:19 - loss: 1.8605 - regression_loss: 1.5895 - classification_loss: 0.2710 297/500 [================>.............] - ETA: 1:19 - loss: 1.8629 - regression_loss: 1.5915 - classification_loss: 0.2714 298/500 [================>.............] - ETA: 1:18 - loss: 1.8626 - regression_loss: 1.5912 - classification_loss: 0.2714 299/500 [================>.............] - ETA: 1:18 - loss: 1.8618 - regression_loss: 1.5909 - classification_loss: 0.2710 300/500 [=================>............] - ETA: 1:17 - loss: 1.8610 - regression_loss: 1.5903 - classification_loss: 0.2707 301/500 [=================>............] - ETA: 1:17 - loss: 1.8595 - regression_loss: 1.5891 - classification_loss: 0.2704 302/500 [=================>............] - ETA: 1:17 - loss: 1.8601 - regression_loss: 1.5895 - classification_loss: 0.2705 303/500 [=================>............] - ETA: 1:16 - loss: 1.8596 - regression_loss: 1.5892 - classification_loss: 0.2704 304/500 [=================>............] - ETA: 1:16 - loss: 1.8589 - regression_loss: 1.5886 - classification_loss: 0.2702 305/500 [=================>............] 
- ETA: 1:15 - loss: 1.8574 - regression_loss: 1.5875 - classification_loss: 0.2699 306/500 [=================>............] - ETA: 1:15 - loss: 1.8559 - regression_loss: 1.5862 - classification_loss: 0.2697 307/500 [=================>............] - ETA: 1:15 - loss: 1.8559 - regression_loss: 1.5864 - classification_loss: 0.2695 308/500 [=================>............] - ETA: 1:14 - loss: 1.8540 - regression_loss: 1.5847 - classification_loss: 0.2693 309/500 [=================>............] - ETA: 1:14 - loss: 1.8526 - regression_loss: 1.5836 - classification_loss: 0.2690 310/500 [=================>............] - ETA: 1:13 - loss: 1.8585 - regression_loss: 1.5881 - classification_loss: 0.2704 311/500 [=================>............] - ETA: 1:13 - loss: 1.8583 - regression_loss: 1.5879 - classification_loss: 0.2704 312/500 [=================>............] - ETA: 1:13 - loss: 1.8583 - regression_loss: 1.5880 - classification_loss: 0.2704 313/500 [=================>............] - ETA: 1:12 - loss: 1.8607 - regression_loss: 1.5892 - classification_loss: 0.2714 314/500 [=================>............] - ETA: 1:12 - loss: 1.8609 - regression_loss: 1.5894 - classification_loss: 0.2715 315/500 [=================>............] - ETA: 1:11 - loss: 1.8599 - regression_loss: 1.5886 - classification_loss: 0.2713 316/500 [=================>............] - ETA: 1:11 - loss: 1.8620 - regression_loss: 1.5904 - classification_loss: 0.2716 317/500 [==================>...........] - ETA: 1:11 - loss: 1.8637 - regression_loss: 1.5916 - classification_loss: 0.2721 318/500 [==================>...........] - ETA: 1:10 - loss: 1.8617 - regression_loss: 1.5899 - classification_loss: 0.2718 319/500 [==================>...........] - ETA: 1:10 - loss: 1.8608 - regression_loss: 1.5890 - classification_loss: 0.2718 320/500 [==================>...........] - ETA: 1:10 - loss: 1.8596 - regression_loss: 1.5879 - classification_loss: 0.2716 321/500 [==================>...........] 
- ETA: 1:09 - loss: 1.8616 - regression_loss: 1.5896 - classification_loss: 0.2721 322/500 [==================>...........] - ETA: 1:09 - loss: 1.8599 - regression_loss: 1.5880 - classification_loss: 0.2718 323/500 [==================>...........] - ETA: 1:08 - loss: 1.8600 - regression_loss: 1.5881 - classification_loss: 0.2719 324/500 [==================>...........] - ETA: 1:08 - loss: 1.8584 - regression_loss: 1.5868 - classification_loss: 0.2716 325/500 [==================>...........] - ETA: 1:08 - loss: 1.8578 - regression_loss: 1.5863 - classification_loss: 0.2715 326/500 [==================>...........] - ETA: 1:07 - loss: 1.8564 - regression_loss: 1.5852 - classification_loss: 0.2712 327/500 [==================>...........] - ETA: 1:07 - loss: 1.8561 - regression_loss: 1.5851 - classification_loss: 0.2710 328/500 [==================>...........] - ETA: 1:06 - loss: 1.8567 - regression_loss: 1.5857 - classification_loss: 0.2710 329/500 [==================>...........] - ETA: 1:06 - loss: 1.8558 - regression_loss: 1.5849 - classification_loss: 0.2709 330/500 [==================>...........] - ETA: 1:06 - loss: 1.8552 - regression_loss: 1.5845 - classification_loss: 0.2707 331/500 [==================>...........] - ETA: 1:05 - loss: 1.8535 - regression_loss: 1.5831 - classification_loss: 0.2704 332/500 [==================>...........] - ETA: 1:05 - loss: 1.8529 - regression_loss: 1.5824 - classification_loss: 0.2706 333/500 [==================>...........] - ETA: 1:04 - loss: 1.8537 - regression_loss: 1.5831 - classification_loss: 0.2706 334/500 [===================>..........] - ETA: 1:04 - loss: 1.8521 - regression_loss: 1.5819 - classification_loss: 0.2702 335/500 [===================>..........] - ETA: 1:04 - loss: 1.8506 - regression_loss: 1.5806 - classification_loss: 0.2700 336/500 [===================>..........] - ETA: 1:03 - loss: 1.8501 - regression_loss: 1.5803 - classification_loss: 0.2698 337/500 [===================>..........] 
- ETA: 1:03 - loss: 1.8478 - regression_loss: 1.5784 - classification_loss: 0.2694 338/500 [===================>..........] - ETA: 1:03 - loss: 1.8447 - regression_loss: 1.5758 - classification_loss: 0.2690 339/500 [===================>..........] - ETA: 1:02 - loss: 1.8418 - regression_loss: 1.5733 - classification_loss: 0.2685 340/500 [===================>..........] - ETA: 1:02 - loss: 1.8413 - regression_loss: 1.5730 - classification_loss: 0.2682 341/500 [===================>..........] - ETA: 1:01 - loss: 1.8405 - regression_loss: 1.5726 - classification_loss: 0.2679 342/500 [===================>..........] - ETA: 1:01 - loss: 1.8402 - regression_loss: 1.5723 - classification_loss: 0.2679 343/500 [===================>..........] - ETA: 1:01 - loss: 1.8379 - regression_loss: 1.5703 - classification_loss: 0.2676 344/500 [===================>..........] - ETA: 1:00 - loss: 1.8376 - regression_loss: 1.5702 - classification_loss: 0.2674 345/500 [===================>..........] - ETA: 1:00 - loss: 1.8365 - regression_loss: 1.5692 - classification_loss: 0.2673 346/500 [===================>..........] - ETA: 59s - loss: 1.8364 - regression_loss: 1.5692 - classification_loss: 0.2672  347/500 [===================>..........] - ETA: 59s - loss: 1.8363 - regression_loss: 1.5689 - classification_loss: 0.2673 348/500 [===================>..........] - ETA: 59s - loss: 1.8381 - regression_loss: 1.5705 - classification_loss: 0.2675 349/500 [===================>..........] - ETA: 58s - loss: 1.8372 - regression_loss: 1.5698 - classification_loss: 0.2674 350/500 [====================>.........] - ETA: 58s - loss: 1.8362 - regression_loss: 1.5690 - classification_loss: 0.2671 351/500 [====================>.........] - ETA: 57s - loss: 1.8361 - regression_loss: 1.5690 - classification_loss: 0.2670 352/500 [====================>.........] - ETA: 57s - loss: 1.8348 - regression_loss: 1.5680 - classification_loss: 0.2668 353/500 [====================>.........] 
- ETA: 57s - loss: 1.8338 - regression_loss: 1.5672 - classification_loss: 0.2666 354/500 [====================>.........] - ETA: 56s - loss: 1.8341 - regression_loss: 1.5674 - classification_loss: 0.2667 355/500 [====================>.........] - ETA: 56s - loss: 1.8322 - regression_loss: 1.5659 - classification_loss: 0.2664 356/500 [====================>.........] - ETA: 56s - loss: 1.8308 - regression_loss: 1.5646 - classification_loss: 0.2661 357/500 [====================>.........] - ETA: 55s - loss: 1.8314 - regression_loss: 1.5653 - classification_loss: 0.2661 358/500 [====================>.........] - ETA: 55s - loss: 1.8307 - regression_loss: 1.5643 - classification_loss: 0.2664 359/500 [====================>.........] - ETA: 54s - loss: 1.8297 - regression_loss: 1.5635 - classification_loss: 0.2662 360/500 [====================>.........] - ETA: 54s - loss: 1.8288 - regression_loss: 1.5628 - classification_loss: 0.2660 361/500 [====================>.........] - ETA: 54s - loss: 1.8277 - regression_loss: 1.5619 - classification_loss: 0.2658 362/500 [====================>.........] - ETA: 53s - loss: 1.8258 - regression_loss: 1.5604 - classification_loss: 0.2654 363/500 [====================>.........] - ETA: 53s - loss: 1.8271 - regression_loss: 1.5616 - classification_loss: 0.2654 364/500 [====================>.........] - ETA: 52s - loss: 1.8307 - regression_loss: 1.5647 - classification_loss: 0.2661 365/500 [====================>.........] - ETA: 52s - loss: 1.8329 - regression_loss: 1.5667 - classification_loss: 0.2662 366/500 [====================>.........] - ETA: 52s - loss: 1.8337 - regression_loss: 1.5674 - classification_loss: 0.2664 367/500 [=====================>........] - ETA: 51s - loss: 1.8362 - regression_loss: 1.5694 - classification_loss: 0.2669 368/500 [=====================>........] - ETA: 51s - loss: 1.8365 - regression_loss: 1.5695 - classification_loss: 0.2670 369/500 [=====================>........] 
- ETA: 50s - loss: 1.8370 - regression_loss: 1.5698 - classification_loss: 0.2672 370/500 [=====================>........] - ETA: 50s - loss: 1.8373 - regression_loss: 1.5702 - classification_loss: 0.2672 371/500 [=====================>........] - ETA: 50s - loss: 1.8380 - regression_loss: 1.5707 - classification_loss: 0.2673 372/500 [=====================>........] - ETA: 49s - loss: 1.8377 - regression_loss: 1.5705 - classification_loss: 0.2673 373/500 [=====================>........] - ETA: 49s - loss: 1.8393 - regression_loss: 1.5717 - classification_loss: 0.2676 374/500 [=====================>........] - ETA: 49s - loss: 1.8409 - regression_loss: 1.5728 - classification_loss: 0.2681 375/500 [=====================>........] - ETA: 48s - loss: 1.8410 - regression_loss: 1.5731 - classification_loss: 0.2680 376/500 [=====================>........] - ETA: 48s - loss: 1.8404 - regression_loss: 1.5725 - classification_loss: 0.2679 377/500 [=====================>........] - ETA: 47s - loss: 1.8381 - regression_loss: 1.5703 - classification_loss: 0.2678 378/500 [=====================>........] - ETA: 47s - loss: 1.8383 - regression_loss: 1.5704 - classification_loss: 0.2679 379/500 [=====================>........] - ETA: 47s - loss: 1.8388 - regression_loss: 1.5710 - classification_loss: 0.2678 380/500 [=====================>........] - ETA: 46s - loss: 1.8383 - regression_loss: 1.5705 - classification_loss: 0.2678 381/500 [=====================>........] - ETA: 46s - loss: 1.8392 - regression_loss: 1.5714 - classification_loss: 0.2677 382/500 [=====================>........] - ETA: 45s - loss: 1.8393 - regression_loss: 1.5716 - classification_loss: 0.2676 383/500 [=====================>........] - ETA: 45s - loss: 1.8396 - regression_loss: 1.5719 - classification_loss: 0.2677 384/500 [======================>.......] - ETA: 45s - loss: 1.8391 - regression_loss: 1.5714 - classification_loss: 0.2677 385/500 [======================>.......] 
- ETA: 44s - loss: 1.8391 - regression_loss: 1.5714 - classification_loss: 0.2677 386/500 [======================>.......] - ETA: 44s - loss: 1.8383 - regression_loss: 1.5709 - classification_loss: 0.2674 387/500 [======================>.......] - ETA: 43s - loss: 1.8386 - regression_loss: 1.5711 - classification_loss: 0.2675 388/500 [======================>.......] - ETA: 43s - loss: 1.8389 - regression_loss: 1.5716 - classification_loss: 0.2673 389/500 [======================>.......] - ETA: 43s - loss: 1.8384 - regression_loss: 1.5712 - classification_loss: 0.2672 390/500 [======================>.......] - ETA: 42s - loss: 1.8384 - regression_loss: 1.5711 - classification_loss: 0.2673 391/500 [======================>.......] - ETA: 42s - loss: 1.8384 - regression_loss: 1.5711 - classification_loss: 0.2672 392/500 [======================>.......] - ETA: 42s - loss: 1.8385 - regression_loss: 1.5711 - classification_loss: 0.2675 393/500 [======================>.......] - ETA: 41s - loss: 1.8389 - regression_loss: 1.5711 - classification_loss: 0.2678 394/500 [======================>.......] - ETA: 41s - loss: 1.8374 - regression_loss: 1.5699 - classification_loss: 0.2675 395/500 [======================>.......] - ETA: 40s - loss: 1.8375 - regression_loss: 1.5700 - classification_loss: 0.2675 396/500 [======================>.......] - ETA: 40s - loss: 1.8365 - regression_loss: 1.5693 - classification_loss: 0.2672 397/500 [======================>.......] - ETA: 40s - loss: 1.8353 - regression_loss: 1.5682 - classification_loss: 0.2671 398/500 [======================>.......] - ETA: 39s - loss: 1.8350 - regression_loss: 1.5678 - classification_loss: 0.2672 399/500 [======================>.......] - ETA: 39s - loss: 1.8351 - regression_loss: 1.5681 - classification_loss: 0.2670 400/500 [=======================>......] - ETA: 38s - loss: 1.8343 - regression_loss: 1.5674 - classification_loss: 0.2669 401/500 [=======================>......] 
- ETA: 38s - loss: 1.8336 - regression_loss: 1.5668 - classification_loss: 0.2667 402/500 [=======================>......] - ETA: 38s - loss: 1.8332 - regression_loss: 1.5665 - classification_loss: 0.2667 403/500 [=======================>......] - ETA: 37s - loss: 1.8322 - regression_loss: 1.5657 - classification_loss: 0.2665 404/500 [=======================>......] - ETA: 37s - loss: 1.8299 - regression_loss: 1.5637 - classification_loss: 0.2663 405/500 [=======================>......] - ETA: 36s - loss: 1.8296 - regression_loss: 1.5634 - classification_loss: 0.2662 406/500 [=======================>......] - ETA: 36s - loss: 1.8290 - regression_loss: 1.5628 - classification_loss: 0.2662 407/500 [=======================>......] - ETA: 36s - loss: 1.8263 - regression_loss: 1.5605 - classification_loss: 0.2659 408/500 [=======================>......] - ETA: 35s - loss: 1.8276 - regression_loss: 1.5615 - classification_loss: 0.2661 409/500 [=======================>......] - ETA: 35s - loss: 1.8262 - regression_loss: 1.5603 - classification_loss: 0.2658 410/500 [=======================>......] - ETA: 34s - loss: 1.8263 - regression_loss: 1.5604 - classification_loss: 0.2659 411/500 [=======================>......] - ETA: 34s - loss: 1.8245 - regression_loss: 1.5589 - classification_loss: 0.2656 412/500 [=======================>......] - ETA: 34s - loss: 1.8244 - regression_loss: 1.5588 - classification_loss: 0.2656 413/500 [=======================>......] - ETA: 33s - loss: 1.8234 - regression_loss: 1.5580 - classification_loss: 0.2654 414/500 [=======================>......] - ETA: 33s - loss: 1.8219 - regression_loss: 1.5568 - classification_loss: 0.2652 415/500 [=======================>......] - ETA: 33s - loss: 1.8210 - regression_loss: 1.5561 - classification_loss: 0.2650 416/500 [=======================>......] - ETA: 32s - loss: 1.8218 - regression_loss: 1.5568 - classification_loss: 0.2650 417/500 [========================>.....] 
- ETA: 32s - loss: 1.8209 - regression_loss: 1.5559 - classification_loss: 0.2649 418/500 [========================>.....] - ETA: 31s - loss: 1.8202 - regression_loss: 1.5553 - classification_loss: 0.2649 419/500 [========================>.....] - ETA: 31s - loss: 1.8194 - regression_loss: 1.5541 - classification_loss: 0.2652 420/500 [========================>.....] - ETA: 31s - loss: 1.8181 - regression_loss: 1.5532 - classification_loss: 0.2650 421/500 [========================>.....] - ETA: 30s - loss: 1.8198 - regression_loss: 1.5544 - classification_loss: 0.2654 422/500 [========================>.....] - ETA: 30s - loss: 1.8202 - regression_loss: 1.5547 - classification_loss: 0.2655 423/500 [========================>.....] - ETA: 29s - loss: 1.8214 - regression_loss: 1.5560 - classification_loss: 0.2654 424/500 [========================>.....] - ETA: 29s - loss: 1.8205 - regression_loss: 1.5554 - classification_loss: 0.2652 425/500 [========================>.....] - ETA: 29s - loss: 1.8199 - regression_loss: 1.5549 - classification_loss: 0.2650 426/500 [========================>.....] - ETA: 28s - loss: 1.8185 - regression_loss: 1.5536 - classification_loss: 0.2648 427/500 [========================>.....] - ETA: 28s - loss: 1.8168 - regression_loss: 1.5523 - classification_loss: 0.2646 428/500 [========================>.....] - ETA: 27s - loss: 1.8146 - regression_loss: 1.5502 - classification_loss: 0.2644 429/500 [========================>.....] - ETA: 27s - loss: 1.8146 - regression_loss: 1.5502 - classification_loss: 0.2644 430/500 [========================>.....] - ETA: 27s - loss: 1.8147 - regression_loss: 1.5504 - classification_loss: 0.2644 431/500 [========================>.....] - ETA: 26s - loss: 1.8144 - regression_loss: 1.5501 - classification_loss: 0.2643 432/500 [========================>.....] - ETA: 26s - loss: 1.8136 - regression_loss: 1.5495 - classification_loss: 0.2641 433/500 [========================>.....] 
[... per-batch progress output truncated ...]
500/500 [==============================] - 194s 389ms/step - loss: 1.7885 - regression_loss: 1.5277 - classification_loss: 0.2608
326 instances of class plum with average precision: 0.7754
mAP: 0.7754
Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/20
[... per-batch progress output truncated ...]
- ETA: 1:30 - loss: 1.5671 - regression_loss: 1.3436 - classification_loss: 0.2235 269/500 [===============>..............] - ETA: 1:30 - loss: 1.5680 - regression_loss: 1.3442 - classification_loss: 0.2238 270/500 [===============>..............] - ETA: 1:29 - loss: 1.5700 - regression_loss: 1.3462 - classification_loss: 0.2238 271/500 [===============>..............] - ETA: 1:29 - loss: 1.5721 - regression_loss: 1.3479 - classification_loss: 0.2242 272/500 [===============>..............] - ETA: 1:28 - loss: 1.5699 - regression_loss: 1.3461 - classification_loss: 0.2238 273/500 [===============>..............] - ETA: 1:28 - loss: 1.5700 - regression_loss: 1.3465 - classification_loss: 0.2235 274/500 [===============>..............] - ETA: 1:28 - loss: 1.5679 - regression_loss: 1.3446 - classification_loss: 0.2233 275/500 [===============>..............] - ETA: 1:27 - loss: 1.5658 - regression_loss: 1.3429 - classification_loss: 0.2228 276/500 [===============>..............] - ETA: 1:27 - loss: 1.5659 - regression_loss: 1.3433 - classification_loss: 0.2226 277/500 [===============>..............] - ETA: 1:26 - loss: 1.5668 - regression_loss: 1.3439 - classification_loss: 0.2229 278/500 [===============>..............] - ETA: 1:26 - loss: 1.5685 - regression_loss: 1.3454 - classification_loss: 0.2230 279/500 [===============>..............] - ETA: 1:26 - loss: 1.5686 - regression_loss: 1.3456 - classification_loss: 0.2230 280/500 [===============>..............] - ETA: 1:25 - loss: 1.5682 - regression_loss: 1.3453 - classification_loss: 0.2229 281/500 [===============>..............] - ETA: 1:25 - loss: 1.5650 - regression_loss: 1.3425 - classification_loss: 0.2225 282/500 [===============>..............] - ETA: 1:25 - loss: 1.5658 - regression_loss: 1.3432 - classification_loss: 0.2226 283/500 [===============>..............] - ETA: 1:24 - loss: 1.5628 - regression_loss: 1.3406 - classification_loss: 0.2222 284/500 [================>.............] 
- ETA: 1:24 - loss: 1.5627 - regression_loss: 1.3405 - classification_loss: 0.2222 285/500 [================>.............] - ETA: 1:23 - loss: 1.5605 - regression_loss: 1.3385 - classification_loss: 0.2220 286/500 [================>.............] - ETA: 1:23 - loss: 1.5609 - regression_loss: 1.3388 - classification_loss: 0.2221 287/500 [================>.............] - ETA: 1:23 - loss: 1.5594 - regression_loss: 1.3375 - classification_loss: 0.2220 288/500 [================>.............] - ETA: 1:22 - loss: 1.5611 - regression_loss: 1.3389 - classification_loss: 0.2222 289/500 [================>.............] - ETA: 1:22 - loss: 1.5595 - regression_loss: 1.3376 - classification_loss: 0.2219 290/500 [================>.............] - ETA: 1:21 - loss: 1.5616 - regression_loss: 1.3393 - classification_loss: 0.2223 291/500 [================>.............] - ETA: 1:21 - loss: 1.5624 - regression_loss: 1.3402 - classification_loss: 0.2222 292/500 [================>.............] - ETA: 1:21 - loss: 1.5612 - regression_loss: 1.3391 - classification_loss: 0.2221 293/500 [================>.............] - ETA: 1:20 - loss: 1.5614 - regression_loss: 1.3391 - classification_loss: 0.2224 294/500 [================>.............] - ETA: 1:20 - loss: 1.5592 - regression_loss: 1.3371 - classification_loss: 0.2221 295/500 [================>.............] - ETA: 1:19 - loss: 1.5606 - regression_loss: 1.3380 - classification_loss: 0.2226 296/500 [================>.............] - ETA: 1:19 - loss: 1.5575 - regression_loss: 1.3353 - classification_loss: 0.2222 297/500 [================>.............] - ETA: 1:19 - loss: 1.5583 - regression_loss: 1.3361 - classification_loss: 0.2222 298/500 [================>.............] - ETA: 1:18 - loss: 1.5584 - regression_loss: 1.3362 - classification_loss: 0.2222 299/500 [================>.............] - ETA: 1:18 - loss: 1.5560 - regression_loss: 1.3342 - classification_loss: 0.2218 300/500 [=================>............] 
- ETA: 1:18 - loss: 1.5567 - regression_loss: 1.3345 - classification_loss: 0.2221 301/500 [=================>............] - ETA: 1:17 - loss: 1.5556 - regression_loss: 1.3337 - classification_loss: 0.2218 302/500 [=================>............] - ETA: 1:17 - loss: 1.5539 - regression_loss: 1.3323 - classification_loss: 0.2216 303/500 [=================>............] - ETA: 1:16 - loss: 1.5533 - regression_loss: 1.3318 - classification_loss: 0.2215 304/500 [=================>............] - ETA: 1:16 - loss: 1.5567 - regression_loss: 1.3344 - classification_loss: 0.2223 305/500 [=================>............] - ETA: 1:16 - loss: 1.5559 - regression_loss: 1.3337 - classification_loss: 0.2222 306/500 [=================>............] - ETA: 1:15 - loss: 1.5555 - regression_loss: 1.3336 - classification_loss: 0.2220 307/500 [=================>............] - ETA: 1:15 - loss: 1.5589 - regression_loss: 1.3363 - classification_loss: 0.2226 308/500 [=================>............] - ETA: 1:14 - loss: 1.5571 - regression_loss: 1.3349 - classification_loss: 0.2222 309/500 [=================>............] - ETA: 1:14 - loss: 1.5578 - regression_loss: 1.3355 - classification_loss: 0.2222 310/500 [=================>............] - ETA: 1:14 - loss: 1.5577 - regression_loss: 1.3358 - classification_loss: 0.2220 311/500 [=================>............] - ETA: 1:13 - loss: 1.5582 - regression_loss: 1.3362 - classification_loss: 0.2220 312/500 [=================>............] - ETA: 1:13 - loss: 1.5567 - regression_loss: 1.3350 - classification_loss: 0.2218 313/500 [=================>............] - ETA: 1:12 - loss: 1.5563 - regression_loss: 1.3347 - classification_loss: 0.2217 314/500 [=================>............] - ETA: 1:12 - loss: 1.5555 - regression_loss: 1.3338 - classification_loss: 0.2217 315/500 [=================>............] - ETA: 1:12 - loss: 1.5529 - regression_loss: 1.3315 - classification_loss: 0.2214 316/500 [=================>............] 
- ETA: 1:11 - loss: 1.5522 - regression_loss: 1.3310 - classification_loss: 0.2212 317/500 [==================>...........] - ETA: 1:11 - loss: 1.5515 - regression_loss: 1.3304 - classification_loss: 0.2211 318/500 [==================>...........] - ETA: 1:11 - loss: 1.5526 - regression_loss: 1.3317 - classification_loss: 0.2209 319/500 [==================>...........] - ETA: 1:10 - loss: 1.5535 - regression_loss: 1.3324 - classification_loss: 0.2210 320/500 [==================>...........] - ETA: 1:10 - loss: 1.5523 - regression_loss: 1.3315 - classification_loss: 0.2208 321/500 [==================>...........] - ETA: 1:09 - loss: 1.5518 - regression_loss: 1.3312 - classification_loss: 0.2206 322/500 [==================>...........] - ETA: 1:09 - loss: 1.5515 - regression_loss: 1.3309 - classification_loss: 0.2206 323/500 [==================>...........] - ETA: 1:09 - loss: 1.5535 - regression_loss: 1.3321 - classification_loss: 0.2214 324/500 [==================>...........] - ETA: 1:08 - loss: 1.5535 - regression_loss: 1.3322 - classification_loss: 0.2213 325/500 [==================>...........] - ETA: 1:08 - loss: 1.5520 - regression_loss: 1.3308 - classification_loss: 0.2213 326/500 [==================>...........] - ETA: 1:07 - loss: 1.5508 - regression_loss: 1.3297 - classification_loss: 0.2211 327/500 [==================>...........] - ETA: 1:07 - loss: 1.5510 - regression_loss: 1.3300 - classification_loss: 0.2210 328/500 [==================>...........] - ETA: 1:07 - loss: 1.5497 - regression_loss: 1.3286 - classification_loss: 0.2211 329/500 [==================>...........] - ETA: 1:06 - loss: 1.5479 - regression_loss: 1.3272 - classification_loss: 0.2207 330/500 [==================>...........] - ETA: 1:06 - loss: 1.5492 - regression_loss: 1.3285 - classification_loss: 0.2207 331/500 [==================>...........] - ETA: 1:05 - loss: 1.5482 - regression_loss: 1.3277 - classification_loss: 0.2205 332/500 [==================>...........] 
- ETA: 1:05 - loss: 1.5466 - regression_loss: 1.3264 - classification_loss: 0.2202 333/500 [==================>...........] - ETA: 1:05 - loss: 1.5449 - regression_loss: 1.3250 - classification_loss: 0.2199 334/500 [===================>..........] - ETA: 1:04 - loss: 1.5436 - regression_loss: 1.3238 - classification_loss: 0.2197 335/500 [===================>..........] - ETA: 1:04 - loss: 1.5429 - regression_loss: 1.3234 - classification_loss: 0.2196 336/500 [===================>..........] - ETA: 1:03 - loss: 1.5421 - regression_loss: 1.3226 - classification_loss: 0.2194 337/500 [===================>..........] - ETA: 1:03 - loss: 1.5429 - regression_loss: 1.3232 - classification_loss: 0.2197 338/500 [===================>..........] - ETA: 1:03 - loss: 1.5402 - regression_loss: 1.3207 - classification_loss: 0.2195 339/500 [===================>..........] - ETA: 1:02 - loss: 1.5388 - regression_loss: 1.3196 - classification_loss: 0.2192 340/500 [===================>..........] - ETA: 1:02 - loss: 1.5390 - regression_loss: 1.3197 - classification_loss: 0.2193 341/500 [===================>..........] - ETA: 1:01 - loss: 1.5399 - regression_loss: 1.3205 - classification_loss: 0.2194 342/500 [===================>..........] - ETA: 1:01 - loss: 1.5406 - regression_loss: 1.3212 - classification_loss: 0.2194 343/500 [===================>..........] - ETA: 1:01 - loss: 1.5386 - regression_loss: 1.3194 - classification_loss: 0.2192 344/500 [===================>..........] - ETA: 1:00 - loss: 1.5378 - regression_loss: 1.3189 - classification_loss: 0.2190 345/500 [===================>..........] - ETA: 1:00 - loss: 1.5364 - regression_loss: 1.3177 - classification_loss: 0.2187 346/500 [===================>..........] - ETA: 1:00 - loss: 1.5355 - regression_loss: 1.3170 - classification_loss: 0.2185 347/500 [===================>..........] - ETA: 59s - loss: 1.5402 - regression_loss: 1.3210 - classification_loss: 0.2193  348/500 [===================>..........] 
- ETA: 59s - loss: 1.5408 - regression_loss: 1.3218 - classification_loss: 0.2190 349/500 [===================>..........] - ETA: 58s - loss: 1.5399 - regression_loss: 1.3210 - classification_loss: 0.2190 350/500 [====================>.........] - ETA: 58s - loss: 1.5399 - regression_loss: 1.3212 - classification_loss: 0.2188 351/500 [====================>.........] - ETA: 58s - loss: 1.5402 - regression_loss: 1.3215 - classification_loss: 0.2187 352/500 [====================>.........] - ETA: 57s - loss: 1.5390 - regression_loss: 1.3205 - classification_loss: 0.2185 353/500 [====================>.........] - ETA: 57s - loss: 1.5390 - regression_loss: 1.3205 - classification_loss: 0.2185 354/500 [====================>.........] - ETA: 56s - loss: 1.5385 - regression_loss: 1.3201 - classification_loss: 0.2183 355/500 [====================>.........] - ETA: 56s - loss: 1.5402 - regression_loss: 1.3215 - classification_loss: 0.2188 356/500 [====================>.........] - ETA: 56s - loss: 1.5395 - regression_loss: 1.3209 - classification_loss: 0.2186 357/500 [====================>.........] - ETA: 55s - loss: 1.5383 - regression_loss: 1.3199 - classification_loss: 0.2183 358/500 [====================>.........] - ETA: 55s - loss: 1.5406 - regression_loss: 1.3218 - classification_loss: 0.2188 359/500 [====================>.........] - ETA: 54s - loss: 1.5379 - regression_loss: 1.3194 - classification_loss: 0.2184 360/500 [====================>.........] - ETA: 54s - loss: 1.5388 - regression_loss: 1.3202 - classification_loss: 0.2186 361/500 [====================>.........] - ETA: 54s - loss: 1.5377 - regression_loss: 1.3192 - classification_loss: 0.2185 362/500 [====================>.........] - ETA: 53s - loss: 1.5383 - regression_loss: 1.3197 - classification_loss: 0.2186 363/500 [====================>.........] - ETA: 53s - loss: 1.5384 - regression_loss: 1.3198 - classification_loss: 0.2186 364/500 [====================>.........] 
- ETA: 53s - loss: 1.5387 - regression_loss: 1.3203 - classification_loss: 0.2184 365/500 [====================>.........] - ETA: 52s - loss: 1.5370 - regression_loss: 1.3189 - classification_loss: 0.2181 366/500 [====================>.........] - ETA: 52s - loss: 1.5380 - regression_loss: 1.3192 - classification_loss: 0.2188 367/500 [=====================>........] - ETA: 51s - loss: 1.5375 - regression_loss: 1.3187 - classification_loss: 0.2187 368/500 [=====================>........] - ETA: 51s - loss: 1.5373 - regression_loss: 1.3187 - classification_loss: 0.2187 369/500 [=====================>........] - ETA: 51s - loss: 1.5363 - regression_loss: 1.3177 - classification_loss: 0.2186 370/500 [=====================>........] - ETA: 50s - loss: 1.5373 - regression_loss: 1.3184 - classification_loss: 0.2188 371/500 [=====================>........] - ETA: 50s - loss: 1.5358 - regression_loss: 1.3173 - classification_loss: 0.2185 372/500 [=====================>........] - ETA: 49s - loss: 1.5368 - regression_loss: 1.3182 - classification_loss: 0.2186 373/500 [=====================>........] - ETA: 49s - loss: 1.5370 - regression_loss: 1.3185 - classification_loss: 0.2184 374/500 [=====================>........] - ETA: 49s - loss: 1.5368 - regression_loss: 1.3183 - classification_loss: 0.2185 375/500 [=====================>........] - ETA: 48s - loss: 1.5378 - regression_loss: 1.3190 - classification_loss: 0.2188 376/500 [=====================>........] - ETA: 48s - loss: 1.5373 - regression_loss: 1.3185 - classification_loss: 0.2188 377/500 [=====================>........] - ETA: 47s - loss: 1.5365 - regression_loss: 1.3177 - classification_loss: 0.2188 378/500 [=====================>........] - ETA: 47s - loss: 1.5381 - regression_loss: 1.3190 - classification_loss: 0.2191 379/500 [=====================>........] - ETA: 47s - loss: 1.5381 - regression_loss: 1.3189 - classification_loss: 0.2191 380/500 [=====================>........] 
- ETA: 46s - loss: 1.5377 - regression_loss: 1.3187 - classification_loss: 0.2190 381/500 [=====================>........] - ETA: 46s - loss: 1.5375 - regression_loss: 1.3185 - classification_loss: 0.2190 382/500 [=====================>........] - ETA: 46s - loss: 1.5383 - regression_loss: 1.3191 - classification_loss: 0.2192 383/500 [=====================>........] - ETA: 45s - loss: 1.5378 - regression_loss: 1.3188 - classification_loss: 0.2190 384/500 [======================>.......] - ETA: 45s - loss: 1.5368 - regression_loss: 1.3178 - classification_loss: 0.2190 385/500 [======================>.......] - ETA: 44s - loss: 1.5380 - regression_loss: 1.3189 - classification_loss: 0.2190 386/500 [======================>.......] - ETA: 44s - loss: 1.5402 - regression_loss: 1.3207 - classification_loss: 0.2195 387/500 [======================>.......] - ETA: 44s - loss: 1.5400 - regression_loss: 1.3207 - classification_loss: 0.2193 388/500 [======================>.......] - ETA: 43s - loss: 1.5397 - regression_loss: 1.3206 - classification_loss: 0.2191 389/500 [======================>.......] - ETA: 43s - loss: 1.5387 - regression_loss: 1.3198 - classification_loss: 0.2189 390/500 [======================>.......] - ETA: 42s - loss: 1.5383 - regression_loss: 1.3195 - classification_loss: 0.2188 391/500 [======================>.......] - ETA: 42s - loss: 1.5376 - regression_loss: 1.3189 - classification_loss: 0.2187 392/500 [======================>.......] - ETA: 42s - loss: 1.5378 - regression_loss: 1.3190 - classification_loss: 0.2188 393/500 [======================>.......] - ETA: 41s - loss: 1.5381 - regression_loss: 1.3193 - classification_loss: 0.2188 394/500 [======================>.......] - ETA: 41s - loss: 1.5375 - regression_loss: 1.3186 - classification_loss: 0.2188 395/500 [======================>.......] - ETA: 40s - loss: 1.5378 - regression_loss: 1.3192 - classification_loss: 0.2187 396/500 [======================>.......] 
- ETA: 40s - loss: 1.5388 - regression_loss: 1.3201 - classification_loss: 0.2187 397/500 [======================>.......] - ETA: 40s - loss: 1.5400 - regression_loss: 1.3210 - classification_loss: 0.2191 398/500 [======================>.......] - ETA: 39s - loss: 1.5402 - regression_loss: 1.3209 - classification_loss: 0.2193 399/500 [======================>.......] - ETA: 39s - loss: 1.5389 - regression_loss: 1.3198 - classification_loss: 0.2190 400/500 [=======================>......] - ETA: 38s - loss: 1.5383 - regression_loss: 1.3193 - classification_loss: 0.2189 401/500 [=======================>......] - ETA: 38s - loss: 1.5386 - regression_loss: 1.3198 - classification_loss: 0.2189 402/500 [=======================>......] - ETA: 38s - loss: 1.5385 - regression_loss: 1.3198 - classification_loss: 0.2187 403/500 [=======================>......] - ETA: 37s - loss: 1.5389 - regression_loss: 1.3202 - classification_loss: 0.2187 404/500 [=======================>......] - ETA: 37s - loss: 1.5369 - regression_loss: 1.3185 - classification_loss: 0.2184 405/500 [=======================>......] - ETA: 37s - loss: 1.5362 - regression_loss: 1.3180 - classification_loss: 0.2182 406/500 [=======================>......] - ETA: 36s - loss: 1.5339 - regression_loss: 1.3158 - classification_loss: 0.2181 407/500 [=======================>......] - ETA: 36s - loss: 1.5333 - regression_loss: 1.3154 - classification_loss: 0.2179 408/500 [=======================>......] - ETA: 35s - loss: 1.5333 - regression_loss: 1.3155 - classification_loss: 0.2178 409/500 [=======================>......] - ETA: 35s - loss: 1.5332 - regression_loss: 1.3154 - classification_loss: 0.2178 410/500 [=======================>......] - ETA: 35s - loss: 1.5323 - regression_loss: 1.3147 - classification_loss: 0.2176 411/500 [=======================>......] - ETA: 34s - loss: 1.5319 - regression_loss: 1.3145 - classification_loss: 0.2174 412/500 [=======================>......] 
- ETA: 34s - loss: 1.5307 - regression_loss: 1.3134 - classification_loss: 0.2173 413/500 [=======================>......] - ETA: 33s - loss: 1.5297 - regression_loss: 1.3125 - classification_loss: 0.2172 414/500 [=======================>......] - ETA: 33s - loss: 1.5291 - regression_loss: 1.3122 - classification_loss: 0.2170 415/500 [=======================>......] - ETA: 33s - loss: 1.5297 - regression_loss: 1.3127 - classification_loss: 0.2170 416/500 [=======================>......] - ETA: 32s - loss: 1.5302 - regression_loss: 1.3126 - classification_loss: 0.2176 417/500 [========================>.....] - ETA: 32s - loss: 1.5294 - regression_loss: 1.3120 - classification_loss: 0.2175 418/500 [========================>.....] - ETA: 31s - loss: 1.5291 - regression_loss: 1.3118 - classification_loss: 0.2173 419/500 [========================>.....] - ETA: 31s - loss: 1.5293 - regression_loss: 1.3122 - classification_loss: 0.2171 420/500 [========================>.....] - ETA: 31s - loss: 1.5284 - regression_loss: 1.3114 - classification_loss: 0.2170 421/500 [========================>.....] - ETA: 30s - loss: 1.5294 - regression_loss: 1.3123 - classification_loss: 0.2171 422/500 [========================>.....] - ETA: 30s - loss: 1.5289 - regression_loss: 1.3120 - classification_loss: 0.2169 423/500 [========================>.....] - ETA: 29s - loss: 1.5286 - regression_loss: 1.3115 - classification_loss: 0.2170 424/500 [========================>.....] - ETA: 29s - loss: 1.5290 - regression_loss: 1.3120 - classification_loss: 0.2170 425/500 [========================>.....] - ETA: 29s - loss: 1.5296 - regression_loss: 1.3125 - classification_loss: 0.2170 426/500 [========================>.....] - ETA: 28s - loss: 1.5301 - regression_loss: 1.3129 - classification_loss: 0.2172 427/500 [========================>.....] - ETA: 28s - loss: 1.5283 - regression_loss: 1.3114 - classification_loss: 0.2169 428/500 [========================>.....] 
- ETA: 28s - loss: 1.5275 - regression_loss: 1.3108 - classification_loss: 0.2167 429/500 [========================>.....] - ETA: 27s - loss: 1.5300 - regression_loss: 1.3130 - classification_loss: 0.2170 430/500 [========================>.....] - ETA: 27s - loss: 1.5303 - regression_loss: 1.3132 - classification_loss: 0.2171 431/500 [========================>.....] - ETA: 26s - loss: 1.5302 - regression_loss: 1.3131 - classification_loss: 0.2171 432/500 [========================>.....] - ETA: 26s - loss: 1.5292 - regression_loss: 1.3120 - classification_loss: 0.2172 433/500 [========================>.....] - ETA: 26s - loss: 1.5293 - regression_loss: 1.3122 - classification_loss: 0.2171 434/500 [=========================>....] - ETA: 25s - loss: 1.5285 - regression_loss: 1.3115 - classification_loss: 0.2170 435/500 [=========================>....] - ETA: 25s - loss: 1.5304 - regression_loss: 1.3131 - classification_loss: 0.2174 436/500 [=========================>....] - ETA: 24s - loss: 1.5306 - regression_loss: 1.3133 - classification_loss: 0.2174 437/500 [=========================>....] - ETA: 24s - loss: 1.5303 - regression_loss: 1.3130 - classification_loss: 0.2172 438/500 [=========================>....] - ETA: 24s - loss: 1.5320 - regression_loss: 1.3142 - classification_loss: 0.2178 439/500 [=========================>....] - ETA: 23s - loss: 1.5308 - regression_loss: 1.3133 - classification_loss: 0.2176 440/500 [=========================>....] - ETA: 23s - loss: 1.5298 - regression_loss: 1.3124 - classification_loss: 0.2174 441/500 [=========================>....] - ETA: 22s - loss: 1.5312 - regression_loss: 1.3134 - classification_loss: 0.2179 442/500 [=========================>....] - ETA: 22s - loss: 1.5303 - regression_loss: 1.3126 - classification_loss: 0.2177 443/500 [=========================>....] - ETA: 22s - loss: 1.5317 - regression_loss: 1.3138 - classification_loss: 0.2179 444/500 [=========================>....] 
- ETA: 21s - loss: 1.5311 - regression_loss: 1.3134 - classification_loss: 0.2177 445/500 [=========================>....] - ETA: 21s - loss: 1.5322 - regression_loss: 1.3144 - classification_loss: 0.2178 446/500 [=========================>....] - ETA: 21s - loss: 1.5321 - regression_loss: 1.3143 - classification_loss: 0.2178 447/500 [=========================>....] - ETA: 20s - loss: 1.5314 - regression_loss: 1.3138 - classification_loss: 0.2176 448/500 [=========================>....] - ETA: 20s - loss: 1.5296 - regression_loss: 1.3122 - classification_loss: 0.2174 449/500 [=========================>....] - ETA: 19s - loss: 1.5304 - regression_loss: 1.3128 - classification_loss: 0.2177 450/500 [==========================>...] - ETA: 19s - loss: 1.5314 - regression_loss: 1.3136 - classification_loss: 0.2178 451/500 [==========================>...] - ETA: 19s - loss: 1.5302 - regression_loss: 1.3126 - classification_loss: 0.2175 452/500 [==========================>...] - ETA: 18s - loss: 1.5293 - regression_loss: 1.3120 - classification_loss: 0.2173 453/500 [==========================>...] - ETA: 18s - loss: 1.5288 - regression_loss: 1.3115 - classification_loss: 0.2173 454/500 [==========================>...] - ETA: 17s - loss: 1.5267 - regression_loss: 1.3097 - classification_loss: 0.2170 455/500 [==========================>...] - ETA: 17s - loss: 1.5291 - regression_loss: 1.3117 - classification_loss: 0.2174 456/500 [==========================>...] - ETA: 17s - loss: 1.5273 - regression_loss: 1.3102 - classification_loss: 0.2172 457/500 [==========================>...] - ETA: 16s - loss: 1.5270 - regression_loss: 1.3100 - classification_loss: 0.2170 458/500 [==========================>...] - ETA: 16s - loss: 1.5265 - regression_loss: 1.3096 - classification_loss: 0.2169 459/500 [==========================>...] - ETA: 15s - loss: 1.5256 - regression_loss: 1.3088 - classification_loss: 0.2168 460/500 [==========================>...] 
- ETA: 15s - loss: 1.5256 - regression_loss: 1.3088 - classification_loss: 0.2169 461/500 [==========================>...] - ETA: 15s - loss: 1.5251 - regression_loss: 1.3083 - classification_loss: 0.2168 462/500 [==========================>...] - ETA: 14s - loss: 1.5249 - regression_loss: 1.3082 - classification_loss: 0.2167 463/500 [==========================>...] - ETA: 14s - loss: 1.5255 - regression_loss: 1.3086 - classification_loss: 0.2169 464/500 [==========================>...] - ETA: 14s - loss: 1.5258 - regression_loss: 1.3088 - classification_loss: 0.2169 465/500 [==========================>...] - ETA: 13s - loss: 1.5247 - regression_loss: 1.3080 - classification_loss: 0.2168 466/500 [==========================>...] - ETA: 13s - loss: 1.5248 - regression_loss: 1.3082 - classification_loss: 0.2166 467/500 [===========================>..] - ETA: 12s - loss: 1.5245 - regression_loss: 1.3078 - classification_loss: 0.2166 468/500 [===========================>..] - ETA: 12s - loss: 1.5239 - regression_loss: 1.3074 - classification_loss: 0.2164 469/500 [===========================>..] - ETA: 12s - loss: 1.5232 - regression_loss: 1.3068 - classification_loss: 0.2164 470/500 [===========================>..] - ETA: 11s - loss: 1.5222 - regression_loss: 1.3059 - classification_loss: 0.2163 471/500 [===========================>..] - ETA: 11s - loss: 1.5253 - regression_loss: 1.3084 - classification_loss: 0.2169 472/500 [===========================>..] - ETA: 10s - loss: 1.5248 - regression_loss: 1.3080 - classification_loss: 0.2168 473/500 [===========================>..] - ETA: 10s - loss: 1.5236 - regression_loss: 1.3070 - classification_loss: 0.2166 474/500 [===========================>..] - ETA: 10s - loss: 1.5230 - regression_loss: 1.3064 - classification_loss: 0.2166 475/500 [===========================>..] - ETA: 9s - loss: 1.5213 - regression_loss: 1.3050 - classification_loss: 0.2163  476/500 [===========================>..] 
- ETA: 9s - loss: 1.5214 - regression_loss: 1.3051 - classification_loss: 0.2163 477/500 [===========================>..] - ETA: 8s - loss: 1.5203 - regression_loss: 1.3042 - classification_loss: 0.2161 478/500 [===========================>..] - ETA: 8s - loss: 1.5209 - regression_loss: 1.3045 - classification_loss: 0.2164 479/500 [===========================>..] - ETA: 8s - loss: 1.5214 - regression_loss: 1.3049 - classification_loss: 0.2164 480/500 [===========================>..] - ETA: 7s - loss: 1.5202 - regression_loss: 1.3040 - classification_loss: 0.2162 481/500 [===========================>..] - ETA: 7s - loss: 1.5193 - regression_loss: 1.3032 - classification_loss: 0.2160 482/500 [===========================>..] - ETA: 7s - loss: 1.5188 - regression_loss: 1.3027 - classification_loss: 0.2160 483/500 [===========================>..] - ETA: 6s - loss: 1.5185 - regression_loss: 1.3026 - classification_loss: 0.2159 484/500 [============================>.] - ETA: 6s - loss: 1.5182 - regression_loss: 1.3024 - classification_loss: 0.2158 485/500 [============================>.] - ETA: 5s - loss: 1.5176 - regression_loss: 1.3020 - classification_loss: 0.2157 486/500 [============================>.] - ETA: 5s - loss: 1.5176 - regression_loss: 1.3020 - classification_loss: 0.2156 487/500 [============================>.] - ETA: 5s - loss: 1.5159 - regression_loss: 1.3006 - classification_loss: 0.2153 488/500 [============================>.] - ETA: 4s - loss: 1.5143 - regression_loss: 1.2991 - classification_loss: 0.2152 489/500 [============================>.] - ETA: 4s - loss: 1.5137 - regression_loss: 1.2985 - classification_loss: 0.2152 490/500 [============================>.] - ETA: 3s - loss: 1.5125 - regression_loss: 1.2976 - classification_loss: 0.2149 491/500 [============================>.] - ETA: 3s - loss: 1.5110 - regression_loss: 1.2963 - classification_loss: 0.2147 492/500 [============================>.] 
[epoch 4 progress, steps 493-499 omitted]
500/500 [==============================] - 195s 389ms/step - loss: 1.5101 - regression_loss: 1.2949 - classification_loss: 0.2153
326 instances of class plum with average precision: 0.8032
mAP: 0.8032
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 5/20
1/500 [..............................] - ETA: 3:04 - loss: 2.2791 - regression_loss: 1.9494 - classification_loss: 0.3297
[epoch 5 progress, steps 2-39 omitted: loss fluctuating between ~1.34 and ~1.53]
- ETA: 2:59 - loss: 1.3727 - regression_loss: 1.1788 - classification_loss: 0.1938 40/500 [=>............................] - ETA: 2:58 - loss: 1.3599 - regression_loss: 1.1668 - classification_loss: 0.1930 41/500 [=>............................] - ETA: 2:58 - loss: 1.3524 - regression_loss: 1.1593 - classification_loss: 0.1931 42/500 [=>............................] - ETA: 2:58 - loss: 1.3595 - regression_loss: 1.1661 - classification_loss: 0.1934 43/500 [=>............................] - ETA: 2:57 - loss: 1.3637 - regression_loss: 1.1662 - classification_loss: 0.1976 44/500 [=>............................] - ETA: 2:57 - loss: 1.3653 - regression_loss: 1.1676 - classification_loss: 0.1976 45/500 [=>............................] - ETA: 2:57 - loss: 1.3616 - regression_loss: 1.1646 - classification_loss: 0.1970 46/500 [=>............................] - ETA: 2:56 - loss: 1.3747 - regression_loss: 1.1753 - classification_loss: 0.1994 47/500 [=>............................] - ETA: 2:56 - loss: 1.3758 - regression_loss: 1.1770 - classification_loss: 0.1988 48/500 [=>............................] - ETA: 2:56 - loss: 1.3825 - regression_loss: 1.1845 - classification_loss: 0.1981 49/500 [=>............................] - ETA: 2:55 - loss: 1.3740 - regression_loss: 1.1769 - classification_loss: 0.1970 50/500 [==>...........................] - ETA: 2:55 - loss: 1.3691 - regression_loss: 1.1727 - classification_loss: 0.1964 51/500 [==>...........................] - ETA: 2:55 - loss: 1.3673 - regression_loss: 1.1712 - classification_loss: 0.1961 52/500 [==>...........................] - ETA: 2:54 - loss: 1.3665 - regression_loss: 1.1713 - classification_loss: 0.1952 53/500 [==>...........................] - ETA: 2:54 - loss: 1.3675 - regression_loss: 1.1722 - classification_loss: 0.1954 54/500 [==>...........................] - ETA: 2:53 - loss: 1.3822 - regression_loss: 1.1842 - classification_loss: 0.1981 55/500 [==>...........................] 
- ETA: 2:53 - loss: 1.3784 - regression_loss: 1.1817 - classification_loss: 0.1967 56/500 [==>...........................] - ETA: 2:53 - loss: 1.3766 - regression_loss: 1.1799 - classification_loss: 0.1967 57/500 [==>...........................] - ETA: 2:52 - loss: 1.4031 - regression_loss: 1.2036 - classification_loss: 0.1995 58/500 [==>...........................] - ETA: 2:52 - loss: 1.4061 - regression_loss: 1.2066 - classification_loss: 0.1995 59/500 [==>...........................] - ETA: 2:51 - loss: 1.4033 - regression_loss: 1.2042 - classification_loss: 0.1991 60/500 [==>...........................] - ETA: 2:51 - loss: 1.3976 - regression_loss: 1.1983 - classification_loss: 0.1993 61/500 [==>...........................] - ETA: 2:51 - loss: 1.3972 - regression_loss: 1.1979 - classification_loss: 0.1993 62/500 [==>...........................] - ETA: 2:50 - loss: 1.4034 - regression_loss: 1.2032 - classification_loss: 0.2003 63/500 [==>...........................] - ETA: 2:50 - loss: 1.4053 - regression_loss: 1.2053 - classification_loss: 0.1999 64/500 [==>...........................] - ETA: 2:49 - loss: 1.4060 - regression_loss: 1.2057 - classification_loss: 0.2003 65/500 [==>...........................] - ETA: 2:49 - loss: 1.3984 - regression_loss: 1.1995 - classification_loss: 0.1989 66/500 [==>...........................] - ETA: 2:48 - loss: 1.3954 - regression_loss: 1.1966 - classification_loss: 0.1987 67/500 [===>..........................] - ETA: 2:48 - loss: 1.3878 - regression_loss: 1.1907 - classification_loss: 0.1971 68/500 [===>..........................] - ETA: 2:47 - loss: 1.3900 - regression_loss: 1.1932 - classification_loss: 0.1968 69/500 [===>..........................] - ETA: 2:47 - loss: 1.4013 - regression_loss: 1.2026 - classification_loss: 0.1987 70/500 [===>..........................] - ETA: 2:47 - loss: 1.4040 - regression_loss: 1.2054 - classification_loss: 0.1986 71/500 [===>..........................] 
- ETA: 2:46 - loss: 1.3993 - regression_loss: 1.2019 - classification_loss: 0.1975 72/500 [===>..........................] - ETA: 2:46 - loss: 1.3989 - regression_loss: 1.2017 - classification_loss: 0.1972 73/500 [===>..........................] - ETA: 2:46 - loss: 1.3948 - regression_loss: 1.1986 - classification_loss: 0.1961 74/500 [===>..........................] - ETA: 2:45 - loss: 1.3942 - regression_loss: 1.1985 - classification_loss: 0.1958 75/500 [===>..........................] - ETA: 2:45 - loss: 1.3858 - regression_loss: 1.1906 - classification_loss: 0.1952 76/500 [===>..........................] - ETA: 2:44 - loss: 1.3747 - regression_loss: 1.1812 - classification_loss: 0.1935 77/500 [===>..........................] - ETA: 2:44 - loss: 1.3846 - regression_loss: 1.1910 - classification_loss: 0.1936 78/500 [===>..........................] - ETA: 2:44 - loss: 1.3829 - regression_loss: 1.1900 - classification_loss: 0.1929 79/500 [===>..........................] - ETA: 2:43 - loss: 1.3823 - regression_loss: 1.1893 - classification_loss: 0.1931 80/500 [===>..........................] - ETA: 2:43 - loss: 1.3886 - regression_loss: 1.1953 - classification_loss: 0.1933 81/500 [===>..........................] - ETA: 2:43 - loss: 1.3779 - regression_loss: 1.1864 - classification_loss: 0.1915 82/500 [===>..........................] - ETA: 2:42 - loss: 1.3798 - regression_loss: 1.1886 - classification_loss: 0.1912 83/500 [===>..........................] - ETA: 2:42 - loss: 1.3821 - regression_loss: 1.1896 - classification_loss: 0.1926 84/500 [====>.........................] - ETA: 2:41 - loss: 1.3814 - regression_loss: 1.1893 - classification_loss: 0.1921 85/500 [====>.........................] - ETA: 2:41 - loss: 1.3801 - regression_loss: 1.1888 - classification_loss: 0.1913 86/500 [====>.........................] - ETA: 2:40 - loss: 1.3723 - regression_loss: 1.1822 - classification_loss: 0.1900 87/500 [====>.........................] 
- ETA: 2:40 - loss: 1.3721 - regression_loss: 1.1828 - classification_loss: 0.1893 88/500 [====>.........................] - ETA: 2:40 - loss: 1.3740 - regression_loss: 1.1844 - classification_loss: 0.1896 89/500 [====>.........................] - ETA: 2:39 - loss: 1.3685 - regression_loss: 1.1800 - classification_loss: 0.1885 90/500 [====>.........................] - ETA: 2:39 - loss: 1.3689 - regression_loss: 1.1808 - classification_loss: 0.1881 91/500 [====>.........................] - ETA: 2:39 - loss: 1.3666 - regression_loss: 1.1794 - classification_loss: 0.1872 92/500 [====>.........................] - ETA: 2:38 - loss: 1.3675 - regression_loss: 1.1802 - classification_loss: 0.1873 93/500 [====>.........................] - ETA: 2:38 - loss: 1.3731 - regression_loss: 1.1849 - classification_loss: 0.1881 94/500 [====>.........................] - ETA: 2:37 - loss: 1.3741 - regression_loss: 1.1860 - classification_loss: 0.1881 95/500 [====>.........................] - ETA: 2:37 - loss: 1.3771 - regression_loss: 1.1885 - classification_loss: 0.1885 96/500 [====>.........................] - ETA: 2:37 - loss: 1.3771 - regression_loss: 1.1891 - classification_loss: 0.1879 97/500 [====>.........................] - ETA: 2:36 - loss: 1.3771 - regression_loss: 1.1898 - classification_loss: 0.1873 98/500 [====>.........................] - ETA: 2:36 - loss: 1.3817 - regression_loss: 1.1941 - classification_loss: 0.1876 99/500 [====>.........................] - ETA: 2:36 - loss: 1.3826 - regression_loss: 1.1947 - classification_loss: 0.1879 100/500 [=====>........................] - ETA: 2:35 - loss: 1.3798 - regression_loss: 1.1927 - classification_loss: 0.1871 101/500 [=====>........................] - ETA: 2:35 - loss: 1.3833 - regression_loss: 1.1950 - classification_loss: 0.1883 102/500 [=====>........................] - ETA: 2:34 - loss: 1.3870 - regression_loss: 1.1986 - classification_loss: 0.1884 103/500 [=====>........................] 
- ETA: 2:34 - loss: 1.3833 - regression_loss: 1.1958 - classification_loss: 0.1875 104/500 [=====>........................] - ETA: 2:34 - loss: 1.3859 - regression_loss: 1.1980 - classification_loss: 0.1879 105/500 [=====>........................] - ETA: 2:33 - loss: 1.3879 - regression_loss: 1.1996 - classification_loss: 0.1884 106/500 [=====>........................] - ETA: 2:33 - loss: 1.3868 - regression_loss: 1.1986 - classification_loss: 0.1882 107/500 [=====>........................] - ETA: 2:32 - loss: 1.3912 - regression_loss: 1.2026 - classification_loss: 0.1886 108/500 [=====>........................] - ETA: 2:32 - loss: 1.3887 - regression_loss: 1.2008 - classification_loss: 0.1879 109/500 [=====>........................] - ETA: 2:32 - loss: 1.3948 - regression_loss: 1.2055 - classification_loss: 0.1893 110/500 [=====>........................] - ETA: 2:31 - loss: 1.3931 - regression_loss: 1.2044 - classification_loss: 0.1887 111/500 [=====>........................] - ETA: 2:31 - loss: 1.4050 - regression_loss: 1.2134 - classification_loss: 0.1916 112/500 [=====>........................] - ETA: 2:30 - loss: 1.4074 - regression_loss: 1.2158 - classification_loss: 0.1915 113/500 [=====>........................] - ETA: 2:30 - loss: 1.4061 - regression_loss: 1.2148 - classification_loss: 0.1912 114/500 [=====>........................] - ETA: 2:29 - loss: 1.4046 - regression_loss: 1.2140 - classification_loss: 0.1907 115/500 [=====>........................] - ETA: 2:29 - loss: 1.4050 - regression_loss: 1.2148 - classification_loss: 0.1901 116/500 [=====>........................] - ETA: 2:28 - loss: 1.4054 - regression_loss: 1.2158 - classification_loss: 0.1896 117/500 [======>.......................] - ETA: 2:28 - loss: 1.4101 - regression_loss: 1.2192 - classification_loss: 0.1908 118/500 [======>.......................] - ETA: 2:28 - loss: 1.4190 - regression_loss: 1.2264 - classification_loss: 0.1925 119/500 [======>.......................] 
- ETA: 2:27 - loss: 1.4171 - regression_loss: 1.2250 - classification_loss: 0.1921 120/500 [======>.......................] - ETA: 2:27 - loss: 1.4168 - regression_loss: 1.2245 - classification_loss: 0.1922 121/500 [======>.......................] - ETA: 2:26 - loss: 1.4157 - regression_loss: 1.2242 - classification_loss: 0.1915 122/500 [======>.......................] - ETA: 2:26 - loss: 1.4117 - regression_loss: 1.2208 - classification_loss: 0.1909 123/500 [======>.......................] - ETA: 2:26 - loss: 1.4047 - regression_loss: 1.2149 - classification_loss: 0.1899 124/500 [======>.......................] - ETA: 2:25 - loss: 1.4022 - regression_loss: 1.2128 - classification_loss: 0.1894 125/500 [======>.......................] - ETA: 2:25 - loss: 1.4076 - regression_loss: 1.2174 - classification_loss: 0.1901 126/500 [======>.......................] - ETA: 2:24 - loss: 1.4050 - regression_loss: 1.2148 - classification_loss: 0.1902 127/500 [======>.......................] - ETA: 2:24 - loss: 1.4042 - regression_loss: 1.2145 - classification_loss: 0.1897 128/500 [======>.......................] - ETA: 2:24 - loss: 1.4051 - regression_loss: 1.2155 - classification_loss: 0.1896 129/500 [======>.......................] - ETA: 2:23 - loss: 1.3987 - regression_loss: 1.2100 - classification_loss: 0.1887 130/500 [======>.......................] - ETA: 2:23 - loss: 1.3947 - regression_loss: 1.2067 - classification_loss: 0.1880 131/500 [======>.......................] - ETA: 2:23 - loss: 1.3923 - regression_loss: 1.2045 - classification_loss: 0.1879 132/500 [======>.......................] - ETA: 2:22 - loss: 1.3905 - regression_loss: 1.2027 - classification_loss: 0.1878 133/500 [======>.......................] - ETA: 2:22 - loss: 1.3880 - regression_loss: 1.2005 - classification_loss: 0.1875 134/500 [=======>......................] - ETA: 2:21 - loss: 1.3866 - regression_loss: 1.1995 - classification_loss: 0.1871 135/500 [=======>......................] 
- ETA: 2:21 - loss: 1.3818 - regression_loss: 1.1953 - classification_loss: 0.1865 136/500 [=======>......................] - ETA: 2:21 - loss: 1.3857 - regression_loss: 1.1986 - classification_loss: 0.1871 137/500 [=======>......................] - ETA: 2:20 - loss: 1.3848 - regression_loss: 1.1978 - classification_loss: 0.1870 138/500 [=======>......................] - ETA: 2:20 - loss: 1.3800 - regression_loss: 1.1937 - classification_loss: 0.1863 139/500 [=======>......................] - ETA: 2:20 - loss: 1.3748 - regression_loss: 1.1891 - classification_loss: 0.1857 140/500 [=======>......................] - ETA: 2:19 - loss: 1.3676 - regression_loss: 1.1828 - classification_loss: 0.1848 141/500 [=======>......................] - ETA: 2:19 - loss: 1.3713 - regression_loss: 1.1857 - classification_loss: 0.1856 142/500 [=======>......................] - ETA: 2:18 - loss: 1.3687 - regression_loss: 1.1837 - classification_loss: 0.1850 143/500 [=======>......................] - ETA: 2:18 - loss: 1.3650 - regression_loss: 1.1805 - classification_loss: 0.1845 144/500 [=======>......................] - ETA: 2:18 - loss: 1.3609 - regression_loss: 1.1770 - classification_loss: 0.1839 145/500 [=======>......................] - ETA: 2:17 - loss: 1.3643 - regression_loss: 1.1802 - classification_loss: 0.1841 146/500 [=======>......................] - ETA: 2:17 - loss: 1.3593 - regression_loss: 1.1760 - classification_loss: 0.1833 147/500 [=======>......................] - ETA: 2:16 - loss: 1.3558 - regression_loss: 1.1728 - classification_loss: 0.1830 148/500 [=======>......................] - ETA: 2:16 - loss: 1.3530 - regression_loss: 1.1705 - classification_loss: 0.1826 149/500 [=======>......................] - ETA: 2:16 - loss: 1.3556 - regression_loss: 1.1732 - classification_loss: 0.1824 150/500 [========>.....................] - ETA: 2:15 - loss: 1.3544 - regression_loss: 1.1722 - classification_loss: 0.1823 151/500 [========>.....................] 
- ETA: 2:15 - loss: 1.3542 - regression_loss: 1.1719 - classification_loss: 0.1823 152/500 [========>.....................] - ETA: 2:15 - loss: 1.3538 - regression_loss: 1.1715 - classification_loss: 0.1823 153/500 [========>.....................] - ETA: 2:14 - loss: 1.3558 - regression_loss: 1.1726 - classification_loss: 0.1832 154/500 [========>.....................] - ETA: 2:14 - loss: 1.3532 - regression_loss: 1.1705 - classification_loss: 0.1827 155/500 [========>.....................] - ETA: 2:13 - loss: 1.3503 - regression_loss: 1.1680 - classification_loss: 0.1823 156/500 [========>.....................] - ETA: 2:13 - loss: 1.3562 - regression_loss: 1.1718 - classification_loss: 0.1843 157/500 [========>.....................] - ETA: 2:13 - loss: 1.3567 - regression_loss: 1.1722 - classification_loss: 0.1845 158/500 [========>.....................] - ETA: 2:12 - loss: 1.3627 - regression_loss: 1.1760 - classification_loss: 0.1867 159/500 [========>.....................] - ETA: 2:12 - loss: 1.3580 - regression_loss: 1.1721 - classification_loss: 0.1859 160/500 [========>.....................] - ETA: 2:12 - loss: 1.3537 - regression_loss: 1.1683 - classification_loss: 0.1855 161/500 [========>.....................] - ETA: 2:11 - loss: 1.3528 - regression_loss: 1.1675 - classification_loss: 0.1853 162/500 [========>.....................] - ETA: 2:11 - loss: 1.3540 - regression_loss: 1.1685 - classification_loss: 0.1855 163/500 [========>.....................] - ETA: 2:10 - loss: 1.3567 - regression_loss: 1.1711 - classification_loss: 0.1856 164/500 [========>.....................] - ETA: 2:10 - loss: 1.3557 - regression_loss: 1.1707 - classification_loss: 0.1850 165/500 [========>.....................] - ETA: 2:10 - loss: 1.3527 - regression_loss: 1.1682 - classification_loss: 0.1845 166/500 [========>.....................] - ETA: 2:09 - loss: 1.3557 - regression_loss: 1.1706 - classification_loss: 0.1851 167/500 [=========>....................] 
- ETA: 2:09 - loss: 1.3579 - regression_loss: 1.1726 - classification_loss: 0.1853 168/500 [=========>....................] - ETA: 2:08 - loss: 1.3529 - regression_loss: 1.1683 - classification_loss: 0.1846 169/500 [=========>....................] - ETA: 2:08 - loss: 1.3633 - regression_loss: 1.1774 - classification_loss: 0.1859 170/500 [=========>....................] - ETA: 2:08 - loss: 1.3655 - regression_loss: 1.1796 - classification_loss: 0.1859 171/500 [=========>....................] - ETA: 2:07 - loss: 1.3696 - regression_loss: 1.1837 - classification_loss: 0.1859 172/500 [=========>....................] - ETA: 2:07 - loss: 1.3709 - regression_loss: 1.1846 - classification_loss: 0.1863 173/500 [=========>....................] - ETA: 2:06 - loss: 1.3738 - regression_loss: 1.1870 - classification_loss: 0.1867 174/500 [=========>....................] - ETA: 2:06 - loss: 1.3736 - regression_loss: 1.1869 - classification_loss: 0.1867 175/500 [=========>....................] - ETA: 2:06 - loss: 1.3731 - regression_loss: 1.1867 - classification_loss: 0.1864 176/500 [=========>....................] - ETA: 2:05 - loss: 1.3724 - regression_loss: 1.1861 - classification_loss: 0.1863 177/500 [=========>....................] - ETA: 2:05 - loss: 1.3729 - regression_loss: 1.1868 - classification_loss: 0.1861 178/500 [=========>....................] - ETA: 2:04 - loss: 1.3711 - regression_loss: 1.1852 - classification_loss: 0.1859 179/500 [=========>....................] - ETA: 2:04 - loss: 1.3695 - regression_loss: 1.1839 - classification_loss: 0.1856 180/500 [=========>....................] - ETA: 2:04 - loss: 1.3674 - regression_loss: 1.1823 - classification_loss: 0.1851 181/500 [=========>....................] - ETA: 2:03 - loss: 1.3712 - regression_loss: 1.1862 - classification_loss: 0.1850 182/500 [=========>....................] - ETA: 2:03 - loss: 1.3707 - regression_loss: 1.1859 - classification_loss: 0.1848 183/500 [=========>....................] 
- ETA: 2:03 - loss: 1.3688 - regression_loss: 1.1843 - classification_loss: 0.1846 184/500 [==========>...................] - ETA: 2:02 - loss: 1.3680 - regression_loss: 1.1838 - classification_loss: 0.1842 185/500 [==========>...................] - ETA: 2:02 - loss: 1.3678 - regression_loss: 1.1838 - classification_loss: 0.1839 186/500 [==========>...................] - ETA: 2:01 - loss: 1.3675 - regression_loss: 1.1838 - classification_loss: 0.1837 187/500 [==========>...................] - ETA: 2:01 - loss: 1.3632 - regression_loss: 1.1802 - classification_loss: 0.1831 188/500 [==========>...................] - ETA: 2:01 - loss: 1.3619 - regression_loss: 1.1785 - classification_loss: 0.1834 189/500 [==========>...................] - ETA: 2:00 - loss: 1.3618 - regression_loss: 1.1785 - classification_loss: 0.1833 190/500 [==========>...................] - ETA: 2:00 - loss: 1.3591 - regression_loss: 1.1764 - classification_loss: 0.1828 191/500 [==========>...................] - ETA: 1:59 - loss: 1.3625 - regression_loss: 1.1792 - classification_loss: 0.1833 192/500 [==========>...................] - ETA: 1:59 - loss: 1.3639 - regression_loss: 1.1801 - classification_loss: 0.1838 193/500 [==========>...................] - ETA: 1:59 - loss: 1.3629 - regression_loss: 1.1791 - classification_loss: 0.1838 194/500 [==========>...................] - ETA: 1:58 - loss: 1.3675 - regression_loss: 1.1826 - classification_loss: 0.1849 195/500 [==========>...................] - ETA: 1:58 - loss: 1.3668 - regression_loss: 1.1822 - classification_loss: 0.1846 196/500 [==========>...................] - ETA: 1:58 - loss: 1.3696 - regression_loss: 1.1841 - classification_loss: 0.1855 197/500 [==========>...................] - ETA: 1:57 - loss: 1.3707 - regression_loss: 1.1853 - classification_loss: 0.1854 198/500 [==========>...................] - ETA: 1:57 - loss: 1.3722 - regression_loss: 1.1866 - classification_loss: 0.1856 199/500 [==========>...................] 
- ETA: 1:56 - loss: 1.3749 - regression_loss: 1.1892 - classification_loss: 0.1857 200/500 [===========>..................] - ETA: 1:56 - loss: 1.3749 - regression_loss: 1.1892 - classification_loss: 0.1857 201/500 [===========>..................] - ETA: 1:56 - loss: 1.3740 - regression_loss: 1.1884 - classification_loss: 0.1856 202/500 [===========>..................] - ETA: 1:55 - loss: 1.3776 - regression_loss: 1.1917 - classification_loss: 0.1859 203/500 [===========>..................] - ETA: 1:55 - loss: 1.3784 - regression_loss: 1.1925 - classification_loss: 0.1859 204/500 [===========>..................] - ETA: 1:54 - loss: 1.3803 - regression_loss: 1.1943 - classification_loss: 0.1861 205/500 [===========>..................] - ETA: 1:54 - loss: 1.3909 - regression_loss: 1.1996 - classification_loss: 0.1913 206/500 [===========>..................] - ETA: 1:54 - loss: 1.3873 - regression_loss: 1.1964 - classification_loss: 0.1909 207/500 [===========>..................] - ETA: 1:53 - loss: 1.3849 - regression_loss: 1.1944 - classification_loss: 0.1904 208/500 [===========>..................] - ETA: 1:53 - loss: 1.3836 - regression_loss: 1.1936 - classification_loss: 0.1900 209/500 [===========>..................] - ETA: 1:52 - loss: 1.3888 - regression_loss: 1.1980 - classification_loss: 0.1908 210/500 [===========>..................] - ETA: 1:52 - loss: 1.3889 - regression_loss: 1.1981 - classification_loss: 0.1908 211/500 [===========>..................] - ETA: 1:52 - loss: 1.3872 - regression_loss: 1.1967 - classification_loss: 0.1904 212/500 [===========>..................] - ETA: 1:51 - loss: 1.3873 - regression_loss: 1.1971 - classification_loss: 0.1902 213/500 [===========>..................] - ETA: 1:51 - loss: 1.3887 - regression_loss: 1.1982 - classification_loss: 0.1904 214/500 [===========>..................] - ETA: 1:50 - loss: 1.3901 - regression_loss: 1.1998 - classification_loss: 0.1904 215/500 [===========>..................] 
- ETA: 1:50 - loss: 1.3890 - regression_loss: 1.1990 - classification_loss: 0.1900 216/500 [===========>..................] - ETA: 1:50 - loss: 1.3893 - regression_loss: 1.1992 - classification_loss: 0.1901 217/500 [============>.................] - ETA: 1:49 - loss: 1.3897 - regression_loss: 1.1997 - classification_loss: 0.1900 218/500 [============>.................] - ETA: 1:49 - loss: 1.3873 - regression_loss: 1.1974 - classification_loss: 0.1899 219/500 [============>.................] - ETA: 1:49 - loss: 1.3875 - regression_loss: 1.1972 - classification_loss: 0.1903 220/500 [============>.................] - ETA: 1:48 - loss: 1.3887 - regression_loss: 1.1986 - classification_loss: 0.1901 221/500 [============>.................] - ETA: 1:48 - loss: 1.3881 - regression_loss: 1.1983 - classification_loss: 0.1898 222/500 [============>.................] - ETA: 1:47 - loss: 1.3865 - regression_loss: 1.1968 - classification_loss: 0.1896 223/500 [============>.................] - ETA: 1:47 - loss: 1.3884 - regression_loss: 1.1986 - classification_loss: 0.1898 224/500 [============>.................] - ETA: 1:47 - loss: 1.3900 - regression_loss: 1.2002 - classification_loss: 0.1898 225/500 [============>.................] - ETA: 1:46 - loss: 1.3891 - regression_loss: 1.1990 - classification_loss: 0.1900 226/500 [============>.................] - ETA: 1:46 - loss: 1.3905 - regression_loss: 1.2005 - classification_loss: 0.1901 227/500 [============>.................] - ETA: 1:45 - loss: 1.3888 - regression_loss: 1.1990 - classification_loss: 0.1898 228/500 [============>.................] - ETA: 1:45 - loss: 1.3889 - regression_loss: 1.1992 - classification_loss: 0.1897 229/500 [============>.................] - ETA: 1:45 - loss: 1.3870 - regression_loss: 1.1976 - classification_loss: 0.1894 230/500 [============>.................] - ETA: 1:44 - loss: 1.3879 - regression_loss: 1.1985 - classification_loss: 0.1894 231/500 [============>.................] 
- ETA: 1:44 - loss: 1.3872 - regression_loss: 1.1979 - classification_loss: 0.1893 232/500 [============>.................] - ETA: 1:44 - loss: 1.3848 - regression_loss: 1.1957 - classification_loss: 0.1891 233/500 [============>.................] - ETA: 1:43 - loss: 1.3882 - regression_loss: 1.1985 - classification_loss: 0.1897 234/500 [=============>................] - ETA: 1:43 - loss: 1.3873 - regression_loss: 1.1978 - classification_loss: 0.1895 235/500 [=============>................] - ETA: 1:42 - loss: 1.3858 - regression_loss: 1.1967 - classification_loss: 0.1892 236/500 [=============>................] - ETA: 1:42 - loss: 1.3865 - regression_loss: 1.1974 - classification_loss: 0.1891 237/500 [=============>................] - ETA: 1:42 - loss: 1.3862 - regression_loss: 1.1971 - classification_loss: 0.1891 238/500 [=============>................] - ETA: 1:41 - loss: 1.3853 - regression_loss: 1.1964 - classification_loss: 0.1889 239/500 [=============>................] - ETA: 1:41 - loss: 1.3871 - regression_loss: 1.1979 - classification_loss: 0.1892 240/500 [=============>................] - ETA: 1:41 - loss: 1.3859 - regression_loss: 1.1968 - classification_loss: 0.1891 241/500 [=============>................] - ETA: 1:40 - loss: 1.3881 - regression_loss: 1.1987 - classification_loss: 0.1894 242/500 [=============>................] - ETA: 1:40 - loss: 1.3882 - regression_loss: 1.1987 - classification_loss: 0.1895 243/500 [=============>................] - ETA: 1:39 - loss: 1.3900 - regression_loss: 1.2000 - classification_loss: 0.1900 244/500 [=============>................] - ETA: 1:39 - loss: 1.3894 - regression_loss: 1.1995 - classification_loss: 0.1899 245/500 [=============>................] - ETA: 1:39 - loss: 1.3875 - regression_loss: 1.1976 - classification_loss: 0.1899 246/500 [=============>................] - ETA: 1:38 - loss: 1.3885 - regression_loss: 1.1985 - classification_loss: 0.1899 247/500 [=============>................] 
[Epoch 5/20: per-batch progress output omitted]
500/500 [==============================] - 194s 389ms/step - loss: 1.3526 - regression_loss: 1.1676 - classification_loss: 0.1850
326 instances of class plum with average precision: 0.8215
mAP: 0.8215
Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5
Epoch 6/20
[Epoch 6/20: per-batch progress output omitted]
- ETA: 2:42 - loss: 1.2282 - regression_loss: 1.0633 - classification_loss: 0.1649 83/500 [===>..........................] - ETA: 2:42 - loss: 1.2232 - regression_loss: 1.0591 - classification_loss: 0.1641 84/500 [====>.........................] - ETA: 2:41 - loss: 1.2273 - regression_loss: 1.0626 - classification_loss: 0.1647 85/500 [====>.........................] - ETA: 2:41 - loss: 1.2286 - regression_loss: 1.0641 - classification_loss: 0.1646 86/500 [====>.........................] - ETA: 2:41 - loss: 1.2282 - regression_loss: 1.0638 - classification_loss: 0.1644 87/500 [====>.........................] - ETA: 2:40 - loss: 1.2352 - regression_loss: 1.0699 - classification_loss: 0.1653 88/500 [====>.........................] - ETA: 2:40 - loss: 1.2425 - regression_loss: 1.0758 - classification_loss: 0.1667 89/500 [====>.........................] - ETA: 2:39 - loss: 1.2371 - regression_loss: 1.0716 - classification_loss: 0.1655 90/500 [====>.........................] - ETA: 2:39 - loss: 1.2346 - regression_loss: 1.0692 - classification_loss: 0.1655 91/500 [====>.........................] - ETA: 2:39 - loss: 1.2357 - regression_loss: 1.0708 - classification_loss: 0.1649 92/500 [====>.........................] - ETA: 2:38 - loss: 1.2353 - regression_loss: 1.0706 - classification_loss: 0.1647 93/500 [====>.........................] - ETA: 2:38 - loss: 1.2369 - regression_loss: 1.0726 - classification_loss: 0.1643 94/500 [====>.........................] - ETA: 2:37 - loss: 1.2334 - regression_loss: 1.0696 - classification_loss: 0.1637 95/500 [====>.........................] - ETA: 2:37 - loss: 1.2325 - regression_loss: 1.0691 - classification_loss: 0.1634 96/500 [====>.........................] - ETA: 2:37 - loss: 1.2335 - regression_loss: 1.0702 - classification_loss: 0.1634 97/500 [====>.........................] - ETA: 2:36 - loss: 1.2361 - regression_loss: 1.0730 - classification_loss: 0.1631 98/500 [====>.........................] 
- ETA: 2:36 - loss: 1.2344 - regression_loss: 1.0717 - classification_loss: 0.1628 99/500 [====>.........................] - ETA: 2:35 - loss: 1.2285 - regression_loss: 1.0667 - classification_loss: 0.1619 100/500 [=====>........................] - ETA: 2:35 - loss: 1.2230 - regression_loss: 1.0618 - classification_loss: 0.1612 101/500 [=====>........................] - ETA: 2:35 - loss: 1.2238 - regression_loss: 1.0626 - classification_loss: 0.1611 102/500 [=====>........................] - ETA: 2:34 - loss: 1.2200 - regression_loss: 1.0598 - classification_loss: 0.1602 103/500 [=====>........................] - ETA: 2:34 - loss: 1.2167 - regression_loss: 1.0570 - classification_loss: 0.1597 104/500 [=====>........................] - ETA: 2:33 - loss: 1.2172 - regression_loss: 1.0571 - classification_loss: 0.1601 105/500 [=====>........................] - ETA: 2:33 - loss: 1.2144 - regression_loss: 1.0549 - classification_loss: 0.1595 106/500 [=====>........................] - ETA: 2:33 - loss: 1.2141 - regression_loss: 1.0550 - classification_loss: 0.1591 107/500 [=====>........................] - ETA: 2:32 - loss: 1.2150 - regression_loss: 1.0556 - classification_loss: 0.1594 108/500 [=====>........................] - ETA: 2:32 - loss: 1.2120 - regression_loss: 1.0526 - classification_loss: 0.1594 109/500 [=====>........................] - ETA: 2:32 - loss: 1.2091 - regression_loss: 1.0495 - classification_loss: 0.1597 110/500 [=====>........................] - ETA: 2:31 - loss: 1.2092 - regression_loss: 1.0498 - classification_loss: 0.1594 111/500 [=====>........................] - ETA: 2:31 - loss: 1.2084 - regression_loss: 1.0493 - classification_loss: 0.1592 112/500 [=====>........................] - ETA: 2:30 - loss: 1.2161 - regression_loss: 1.0554 - classification_loss: 0.1607 113/500 [=====>........................] - ETA: 2:30 - loss: 1.2132 - regression_loss: 1.0531 - classification_loss: 0.1601 114/500 [=====>........................] 
- ETA: 2:30 - loss: 1.2190 - regression_loss: 1.0591 - classification_loss: 0.1599 115/500 [=====>........................] - ETA: 2:29 - loss: 1.2157 - regression_loss: 1.0564 - classification_loss: 0.1593 116/500 [=====>........................] - ETA: 2:29 - loss: 1.2186 - regression_loss: 1.0590 - classification_loss: 0.1596 117/500 [======>.......................] - ETA: 2:28 - loss: 1.2134 - regression_loss: 1.0545 - classification_loss: 0.1590 118/500 [======>.......................] - ETA: 2:28 - loss: 1.2129 - regression_loss: 1.0541 - classification_loss: 0.1588 119/500 [======>.......................] - ETA: 2:28 - loss: 1.2120 - regression_loss: 1.0536 - classification_loss: 0.1583 120/500 [======>.......................] - ETA: 2:27 - loss: 1.2065 - regression_loss: 1.0488 - classification_loss: 0.1577 121/500 [======>.......................] - ETA: 2:27 - loss: 1.2055 - regression_loss: 1.0481 - classification_loss: 0.1574 122/500 [======>.......................] - ETA: 2:27 - loss: 1.2104 - regression_loss: 1.0520 - classification_loss: 0.1584 123/500 [======>.......................] - ETA: 2:26 - loss: 1.2086 - regression_loss: 1.0504 - classification_loss: 0.1582 124/500 [======>.......................] - ETA: 2:26 - loss: 1.2042 - regression_loss: 1.0466 - classification_loss: 0.1576 125/500 [======>.......................] - ETA: 2:25 - loss: 1.2054 - regression_loss: 1.0476 - classification_loss: 0.1578 126/500 [======>.......................] - ETA: 2:25 - loss: 1.2068 - regression_loss: 1.0489 - classification_loss: 0.1579 127/500 [======>.......................] - ETA: 2:25 - loss: 1.2061 - regression_loss: 1.0482 - classification_loss: 0.1579 128/500 [======>.......................] - ETA: 2:24 - loss: 1.2107 - regression_loss: 1.0525 - classification_loss: 0.1581 129/500 [======>.......................] - ETA: 2:24 - loss: 1.2157 - regression_loss: 1.0566 - classification_loss: 0.1590 130/500 [======>.......................] 
- ETA: 2:24 - loss: 1.2128 - regression_loss: 1.0543 - classification_loss: 0.1585 131/500 [======>.......................] - ETA: 2:23 - loss: 1.2131 - regression_loss: 1.0548 - classification_loss: 0.1583 132/500 [======>.......................] - ETA: 2:23 - loss: 1.2127 - regression_loss: 1.0548 - classification_loss: 0.1579 133/500 [======>.......................] - ETA: 2:22 - loss: 1.2156 - regression_loss: 1.0570 - classification_loss: 0.1585 134/500 [=======>......................] - ETA: 2:22 - loss: 1.2244 - regression_loss: 1.0648 - classification_loss: 0.1595 135/500 [=======>......................] - ETA: 2:22 - loss: 1.2197 - regression_loss: 1.0609 - classification_loss: 0.1588 136/500 [=======>......................] - ETA: 2:21 - loss: 1.2278 - regression_loss: 1.0665 - classification_loss: 0.1612 137/500 [=======>......................] - ETA: 2:21 - loss: 1.2274 - regression_loss: 1.0663 - classification_loss: 0.1611 138/500 [=======>......................] - ETA: 2:20 - loss: 1.2256 - regression_loss: 1.0648 - classification_loss: 0.1608 139/500 [=======>......................] - ETA: 2:20 - loss: 1.2341 - regression_loss: 1.0720 - classification_loss: 0.1621 140/500 [=======>......................] - ETA: 2:19 - loss: 1.2334 - regression_loss: 1.0714 - classification_loss: 0.1620 141/500 [=======>......................] - ETA: 2:19 - loss: 1.2298 - regression_loss: 1.0681 - classification_loss: 0.1617 142/500 [=======>......................] - ETA: 2:19 - loss: 1.2279 - regression_loss: 1.0663 - classification_loss: 0.1616 143/500 [=======>......................] - ETA: 2:18 - loss: 1.2278 - regression_loss: 1.0663 - classification_loss: 0.1616 144/500 [=======>......................] - ETA: 2:18 - loss: 1.2281 - regression_loss: 1.0667 - classification_loss: 0.1614 145/500 [=======>......................] - ETA: 2:17 - loss: 1.2288 - regression_loss: 1.0673 - classification_loss: 0.1614 146/500 [=======>......................] 
- ETA: 2:17 - loss: 1.2324 - regression_loss: 1.0705 - classification_loss: 0.1619 147/500 [=======>......................] - ETA: 2:17 - loss: 1.2314 - regression_loss: 1.0697 - classification_loss: 0.1617 148/500 [=======>......................] - ETA: 2:16 - loss: 1.2297 - regression_loss: 1.0684 - classification_loss: 0.1612 149/500 [=======>......................] - ETA: 2:16 - loss: 1.2304 - regression_loss: 1.0690 - classification_loss: 0.1614 150/500 [========>.....................] - ETA: 2:16 - loss: 1.2294 - regression_loss: 1.0684 - classification_loss: 0.1610 151/500 [========>.....................] - ETA: 2:15 - loss: 1.2285 - regression_loss: 1.0672 - classification_loss: 0.1613 152/500 [========>.....................] - ETA: 2:15 - loss: 1.2275 - regression_loss: 1.0665 - classification_loss: 0.1610 153/500 [========>.....................] - ETA: 2:14 - loss: 1.2315 - regression_loss: 1.0702 - classification_loss: 0.1613 154/500 [========>.....................] - ETA: 2:14 - loss: 1.2272 - regression_loss: 1.0664 - classification_loss: 0.1608 155/500 [========>.....................] - ETA: 2:14 - loss: 1.2294 - regression_loss: 1.0678 - classification_loss: 0.1616 156/500 [========>.....................] - ETA: 2:13 - loss: 1.2275 - regression_loss: 1.0660 - classification_loss: 0.1615 157/500 [========>.....................] - ETA: 2:13 - loss: 1.2275 - regression_loss: 1.0662 - classification_loss: 0.1613 158/500 [========>.....................] - ETA: 2:12 - loss: 1.2262 - regression_loss: 1.0652 - classification_loss: 0.1611 159/500 [========>.....................] - ETA: 2:12 - loss: 1.2314 - regression_loss: 1.0693 - classification_loss: 0.1621 160/500 [========>.....................] - ETA: 2:12 - loss: 1.2321 - regression_loss: 1.0700 - classification_loss: 0.1622 161/500 [========>.....................] - ETA: 2:11 - loss: 1.2301 - regression_loss: 1.0682 - classification_loss: 0.1619 162/500 [========>.....................] 
- ETA: 2:11 - loss: 1.2293 - regression_loss: 1.0677 - classification_loss: 0.1616 163/500 [========>.....................] - ETA: 2:11 - loss: 1.2263 - regression_loss: 1.0652 - classification_loss: 0.1612 164/500 [========>.....................] - ETA: 2:10 - loss: 1.2262 - regression_loss: 1.0652 - classification_loss: 0.1610 165/500 [========>.....................] - ETA: 2:10 - loss: 1.2271 - regression_loss: 1.0664 - classification_loss: 0.1607 166/500 [========>.....................] - ETA: 2:09 - loss: 1.2267 - regression_loss: 1.0661 - classification_loss: 0.1606 167/500 [=========>....................] - ETA: 2:09 - loss: 1.2303 - regression_loss: 1.0693 - classification_loss: 0.1610 168/500 [=========>....................] - ETA: 2:09 - loss: 1.2304 - regression_loss: 1.0694 - classification_loss: 0.1610 169/500 [=========>....................] - ETA: 2:08 - loss: 1.2301 - regression_loss: 1.0692 - classification_loss: 0.1609 170/500 [=========>....................] - ETA: 2:08 - loss: 1.2271 - regression_loss: 1.0664 - classification_loss: 0.1608 171/500 [=========>....................] - ETA: 2:07 - loss: 1.2239 - regression_loss: 1.0633 - classification_loss: 0.1607 172/500 [=========>....................] - ETA: 2:07 - loss: 1.2233 - regression_loss: 1.0627 - classification_loss: 0.1605 173/500 [=========>....................] - ETA: 2:06 - loss: 1.2228 - regression_loss: 1.0623 - classification_loss: 0.1605 174/500 [=========>....................] - ETA: 2:06 - loss: 1.2213 - regression_loss: 1.0612 - classification_loss: 0.1601 175/500 [=========>....................] - ETA: 2:06 - loss: 1.2212 - regression_loss: 1.0614 - classification_loss: 0.1598 176/500 [=========>....................] - ETA: 2:05 - loss: 1.2212 - regression_loss: 1.0615 - classification_loss: 0.1598 177/500 [=========>....................] - ETA: 2:05 - loss: 1.2218 - regression_loss: 1.0615 - classification_loss: 0.1603 178/500 [=========>....................] 
- ETA: 2:05 - loss: 1.2281 - regression_loss: 1.0668 - classification_loss: 0.1612 179/500 [=========>....................] - ETA: 2:04 - loss: 1.2294 - regression_loss: 1.0684 - classification_loss: 0.1611 180/500 [=========>....................] - ETA: 2:04 - loss: 1.2261 - regression_loss: 1.0653 - classification_loss: 0.1608 181/500 [=========>....................] - ETA: 2:03 - loss: 1.2276 - regression_loss: 1.0667 - classification_loss: 0.1609 182/500 [=========>....................] - ETA: 2:03 - loss: 1.2260 - regression_loss: 1.0653 - classification_loss: 0.1607 183/500 [=========>....................] - ETA: 2:03 - loss: 1.2234 - regression_loss: 1.0632 - classification_loss: 0.1602 184/500 [==========>...................] - ETA: 2:02 - loss: 1.2233 - regression_loss: 1.0631 - classification_loss: 0.1602 185/500 [==========>...................] - ETA: 2:02 - loss: 1.2233 - regression_loss: 1.0633 - classification_loss: 0.1601 186/500 [==========>...................] - ETA: 2:01 - loss: 1.2213 - regression_loss: 1.0616 - classification_loss: 0.1596 187/500 [==========>...................] - ETA: 2:01 - loss: 1.2208 - regression_loss: 1.0613 - classification_loss: 0.1595 188/500 [==========>...................] - ETA: 2:01 - loss: 1.2248 - regression_loss: 1.0648 - classification_loss: 0.1600 189/500 [==========>...................] - ETA: 2:00 - loss: 1.2251 - regression_loss: 1.0650 - classification_loss: 0.1600 190/500 [==========>...................] - ETA: 2:00 - loss: 1.2262 - regression_loss: 1.0660 - classification_loss: 0.1602 191/500 [==========>...................] - ETA: 1:59 - loss: 1.2279 - regression_loss: 1.0674 - classification_loss: 0.1605 192/500 [==========>...................] - ETA: 1:59 - loss: 1.2281 - regression_loss: 1.0678 - classification_loss: 0.1604 193/500 [==========>...................] - ETA: 1:59 - loss: 1.2266 - regression_loss: 1.0666 - classification_loss: 0.1600 194/500 [==========>...................] 
- ETA: 1:58 - loss: 1.2224 - regression_loss: 1.0629 - classification_loss: 0.1595 195/500 [==========>...................] - ETA: 1:58 - loss: 1.2240 - regression_loss: 1.0643 - classification_loss: 0.1598 196/500 [==========>...................] - ETA: 1:58 - loss: 1.2263 - regression_loss: 1.0659 - classification_loss: 0.1603 197/500 [==========>...................] - ETA: 1:57 - loss: 1.2247 - regression_loss: 1.0646 - classification_loss: 0.1601 198/500 [==========>...................] - ETA: 1:57 - loss: 1.2269 - regression_loss: 1.0664 - classification_loss: 0.1605 199/500 [==========>...................] - ETA: 1:56 - loss: 1.2246 - regression_loss: 1.0645 - classification_loss: 0.1601 200/500 [===========>..................] - ETA: 1:56 - loss: 1.2228 - regression_loss: 1.0629 - classification_loss: 0.1598 201/500 [===========>..................] - ETA: 1:56 - loss: 1.2238 - regression_loss: 1.0639 - classification_loss: 0.1598 202/500 [===========>..................] - ETA: 1:55 - loss: 1.2236 - regression_loss: 1.0639 - classification_loss: 0.1597 203/500 [===========>..................] - ETA: 1:55 - loss: 1.2237 - regression_loss: 1.0638 - classification_loss: 0.1599 204/500 [===========>..................] - ETA: 1:55 - loss: 1.2216 - regression_loss: 1.0620 - classification_loss: 0.1596 205/500 [===========>..................] - ETA: 1:54 - loss: 1.2224 - regression_loss: 1.0627 - classification_loss: 0.1597 206/500 [===========>..................] - ETA: 1:54 - loss: 1.2267 - regression_loss: 1.0663 - classification_loss: 0.1604 207/500 [===========>..................] - ETA: 1:53 - loss: 1.2259 - regression_loss: 1.0656 - classification_loss: 0.1603 208/500 [===========>..................] - ETA: 1:53 - loss: 1.2254 - regression_loss: 1.0653 - classification_loss: 0.1601 209/500 [===========>..................] - ETA: 1:53 - loss: 1.2227 - regression_loss: 1.0627 - classification_loss: 0.1600 210/500 [===========>..................] 
- ETA: 1:52 - loss: 1.2224 - regression_loss: 1.0627 - classification_loss: 0.1597 211/500 [===========>..................] - ETA: 1:52 - loss: 1.2201 - regression_loss: 1.0606 - classification_loss: 0.1595 212/500 [===========>..................] - ETA: 1:52 - loss: 1.2217 - regression_loss: 1.0622 - classification_loss: 0.1596 213/500 [===========>..................] - ETA: 1:51 - loss: 1.2226 - regression_loss: 1.0630 - classification_loss: 0.1596 214/500 [===========>..................] - ETA: 1:51 - loss: 1.2221 - regression_loss: 1.0627 - classification_loss: 0.1594 215/500 [===========>..................] - ETA: 1:50 - loss: 1.2216 - regression_loss: 1.0620 - classification_loss: 0.1596 216/500 [===========>..................] - ETA: 1:50 - loss: 1.2197 - regression_loss: 1.0603 - classification_loss: 0.1593 217/500 [============>.................] - ETA: 1:50 - loss: 1.2183 - regression_loss: 1.0593 - classification_loss: 0.1590 218/500 [============>.................] - ETA: 1:49 - loss: 1.2187 - regression_loss: 1.0599 - classification_loss: 0.1588 219/500 [============>.................] - ETA: 1:49 - loss: 1.2185 - regression_loss: 1.0599 - classification_loss: 0.1586 220/500 [============>.................] - ETA: 1:48 - loss: 1.2170 - regression_loss: 1.0584 - classification_loss: 0.1586 221/500 [============>.................] - ETA: 1:48 - loss: 1.2152 - regression_loss: 1.0569 - classification_loss: 0.1583 222/500 [============>.................] - ETA: 1:48 - loss: 1.2166 - regression_loss: 1.0581 - classification_loss: 0.1586 223/500 [============>.................] - ETA: 1:47 - loss: 1.2164 - regression_loss: 1.0578 - classification_loss: 0.1585 224/500 [============>.................] - ETA: 1:47 - loss: 1.2181 - regression_loss: 1.0590 - classification_loss: 0.1591 225/500 [============>.................] - ETA: 1:47 - loss: 1.2159 - regression_loss: 1.0569 - classification_loss: 0.1590 226/500 [============>.................] 
- ETA: 1:46 - loss: 1.2161 - regression_loss: 1.0573 - classification_loss: 0.1588 227/500 [============>.................] - ETA: 1:46 - loss: 1.2178 - regression_loss: 1.0588 - classification_loss: 0.1590 228/500 [============>.................] - ETA: 1:45 - loss: 1.2175 - regression_loss: 1.0586 - classification_loss: 0.1589 229/500 [============>.................] - ETA: 1:45 - loss: 1.2164 - regression_loss: 1.0576 - classification_loss: 0.1588 230/500 [============>.................] - ETA: 1:45 - loss: 1.2163 - regression_loss: 1.0578 - classification_loss: 0.1586 231/500 [============>.................] - ETA: 1:44 - loss: 1.2162 - regression_loss: 1.0576 - classification_loss: 0.1587 232/500 [============>.................] - ETA: 1:44 - loss: 1.2130 - regression_loss: 1.0548 - classification_loss: 0.1582 233/500 [============>.................] - ETA: 1:43 - loss: 1.2118 - regression_loss: 1.0538 - classification_loss: 0.1580 234/500 [=============>................] - ETA: 1:43 - loss: 1.2143 - regression_loss: 1.0555 - classification_loss: 0.1588 235/500 [=============>................] - ETA: 1:43 - loss: 1.2173 - regression_loss: 1.0578 - classification_loss: 0.1594 236/500 [=============>................] - ETA: 1:42 - loss: 1.2154 - regression_loss: 1.0563 - classification_loss: 0.1591 237/500 [=============>................] - ETA: 1:42 - loss: 1.2138 - regression_loss: 1.0549 - classification_loss: 0.1588 238/500 [=============>................] - ETA: 1:41 - loss: 1.2170 - regression_loss: 1.0572 - classification_loss: 0.1598 239/500 [=============>................] - ETA: 1:41 - loss: 1.2196 - regression_loss: 1.0593 - classification_loss: 0.1604 240/500 [=============>................] - ETA: 1:41 - loss: 1.2208 - regression_loss: 1.0604 - classification_loss: 0.1604 241/500 [=============>................] - ETA: 1:40 - loss: 1.2203 - regression_loss: 1.0601 - classification_loss: 0.1602 242/500 [=============>................] 
- ETA: 1:40 - loss: 1.2204 - regression_loss: 1.0603 - classification_loss: 0.1601 243/500 [=============>................] - ETA: 1:39 - loss: 1.2206 - regression_loss: 1.0605 - classification_loss: 0.1601 244/500 [=============>................] - ETA: 1:39 - loss: 1.2202 - regression_loss: 1.0602 - classification_loss: 0.1600 245/500 [=============>................] - ETA: 1:39 - loss: 1.2174 - regression_loss: 1.0577 - classification_loss: 0.1597 246/500 [=============>................] - ETA: 1:38 - loss: 1.2175 - regression_loss: 1.0577 - classification_loss: 0.1598 247/500 [=============>................] - ETA: 1:38 - loss: 1.2161 - regression_loss: 1.0566 - classification_loss: 0.1595 248/500 [=============>................] - ETA: 1:37 - loss: 1.2151 - regression_loss: 1.0558 - classification_loss: 0.1593 249/500 [=============>................] - ETA: 1:37 - loss: 1.2148 - regression_loss: 1.0556 - classification_loss: 0.1591 250/500 [==============>...............] - ETA: 1:37 - loss: 1.2151 - regression_loss: 1.0562 - classification_loss: 0.1589 251/500 [==============>...............] - ETA: 1:36 - loss: 1.2182 - regression_loss: 1.0589 - classification_loss: 0.1593 252/500 [==============>...............] - ETA: 1:36 - loss: 1.2168 - regression_loss: 1.0578 - classification_loss: 0.1590 253/500 [==============>...............] - ETA: 1:36 - loss: 1.2169 - regression_loss: 1.0580 - classification_loss: 0.1589 254/500 [==============>...............] - ETA: 1:35 - loss: 1.2161 - regression_loss: 1.0572 - classification_loss: 0.1589 255/500 [==============>...............] - ETA: 1:35 - loss: 1.2181 - regression_loss: 1.0588 - classification_loss: 0.1593 256/500 [==============>...............] - ETA: 1:34 - loss: 1.2192 - regression_loss: 1.0601 - classification_loss: 0.1591 257/500 [==============>...............] - ETA: 1:34 - loss: 1.2203 - regression_loss: 1.0609 - classification_loss: 0.1594 258/500 [==============>...............] 
- ETA: 1:34 - loss: 1.2184 - regression_loss: 1.0594 - classification_loss: 0.1590 259/500 [==============>...............] - ETA: 1:33 - loss: 1.2166 - regression_loss: 1.0580 - classification_loss: 0.1587 260/500 [==============>...............] - ETA: 1:33 - loss: 1.2177 - regression_loss: 1.0587 - classification_loss: 0.1590 261/500 [==============>...............] - ETA: 1:32 - loss: 1.2218 - regression_loss: 1.0620 - classification_loss: 0.1598 262/500 [==============>...............] - ETA: 1:32 - loss: 1.2197 - regression_loss: 1.0601 - classification_loss: 0.1596 263/500 [==============>...............] - ETA: 1:32 - loss: 1.2205 - regression_loss: 1.0607 - classification_loss: 0.1599 264/500 [==============>...............] - ETA: 1:31 - loss: 1.2207 - regression_loss: 1.0609 - classification_loss: 0.1598 265/500 [==============>...............] - ETA: 1:31 - loss: 1.2193 - regression_loss: 1.0598 - classification_loss: 0.1595 266/500 [==============>...............] - ETA: 1:31 - loss: 1.2184 - regression_loss: 1.0591 - classification_loss: 0.1593 267/500 [===============>..............] - ETA: 1:30 - loss: 1.2203 - regression_loss: 1.0606 - classification_loss: 0.1597 268/500 [===============>..............] - ETA: 1:30 - loss: 1.2208 - regression_loss: 1.0611 - classification_loss: 0.1596 269/500 [===============>..............] - ETA: 1:29 - loss: 1.2185 - regression_loss: 1.0591 - classification_loss: 0.1594 270/500 [===============>..............] - ETA: 1:29 - loss: 1.2195 - regression_loss: 1.0599 - classification_loss: 0.1596 271/500 [===============>..............] - ETA: 1:29 - loss: 1.2172 - regression_loss: 1.0579 - classification_loss: 0.1593 272/500 [===============>..............] - ETA: 1:28 - loss: 1.2196 - regression_loss: 1.0595 - classification_loss: 0.1601 273/500 [===============>..............] - ETA: 1:28 - loss: 1.2185 - regression_loss: 1.0584 - classification_loss: 0.1601 274/500 [===============>..............] 
- ETA: 1:27 - loss: 1.2183 - regression_loss: 1.0584 - classification_loss: 0.1599 275/500 [===============>..............] - ETA: 1:27 - loss: 1.2165 - regression_loss: 1.0570 - classification_loss: 0.1596 276/500 [===============>..............] - ETA: 1:27 - loss: 1.2141 - regression_loss: 1.0547 - classification_loss: 0.1593 277/500 [===============>..............] - ETA: 1:26 - loss: 1.2152 - regression_loss: 1.0558 - classification_loss: 0.1595 278/500 [===============>..............] - ETA: 1:26 - loss: 1.2149 - regression_loss: 1.0556 - classification_loss: 0.1593 279/500 [===============>..............] - ETA: 1:26 - loss: 1.2154 - regression_loss: 1.0560 - classification_loss: 0.1594 280/500 [===============>..............] - ETA: 1:25 - loss: 1.2155 - regression_loss: 1.0561 - classification_loss: 0.1594 281/500 [===============>..............] - ETA: 1:25 - loss: 1.2173 - regression_loss: 1.0579 - classification_loss: 0.1595 282/500 [===============>..............] - ETA: 1:24 - loss: 1.2188 - regression_loss: 1.0591 - classification_loss: 0.1597 283/500 [===============>..............] - ETA: 1:24 - loss: 1.2179 - regression_loss: 1.0583 - classification_loss: 0.1596 284/500 [================>.............] - ETA: 1:24 - loss: 1.2179 - regression_loss: 1.0584 - classification_loss: 0.1595 285/500 [================>.............] - ETA: 1:23 - loss: 1.2181 - regression_loss: 1.0586 - classification_loss: 0.1595 286/500 [================>.............] - ETA: 1:23 - loss: 1.2171 - regression_loss: 1.0577 - classification_loss: 0.1593 287/500 [================>.............] - ETA: 1:22 - loss: 1.2195 - regression_loss: 1.0597 - classification_loss: 0.1598 288/500 [================>.............] - ETA: 1:22 - loss: 1.2189 - regression_loss: 1.0592 - classification_loss: 0.1597 289/500 [================>.............] - ETA: 1:22 - loss: 1.2184 - regression_loss: 1.0589 - classification_loss: 0.1595 290/500 [================>.............] 
... (per-batch progress updates for steps 291-498 of epoch 6 omitted; total loss declined gradually from ~1.22 to ~1.18) ...
500/500 [==============================] - 195s 390ms/step - loss: 1.1807 - regression_loss: 1.0262 - classification_loss: 0.1545
326 instances of class plum with average precision: 0.8157
mAP: 0.8157
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/20
... (per-batch progress for steps 1-12 omitted) ...
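The end-of-epoch summary lines above follow a fixed Keras progress-bar format, so the loss components can be recovered from a saved log. The following is a minimal sketch, assuming a log file in exactly this format; the regex and the `parse_epoch_summary` helper are illustrative, not part of the training script:

```python
import re

# Matches only the completed-epoch line, whose progress bar is all '=',
# e.g. "500/500 [=====...=====] - 195s 390ms/step - loss: 1.1807 - ..."
# Partial-step lines like "291/500 [====>....]" contain '>' and '.' in
# the bar and therefore do not match.
SUMMARY_RE = re.compile(
    r"(?P<step>\d+)/(?P<total>\d+) \[=+\].*?"
    r"loss: (?P<loss>[\d.]+) - "
    r"regression_loss: (?P<reg>[\d.]+) - "
    r"classification_loss: (?P<cls>[\d.]+)"
)

def parse_epoch_summary(line):
    """Return (loss, regression_loss, classification_loss) for an
    end-of-epoch summary line, or None for any other line."""
    m = SUMMARY_RE.search(line)
    if m and m.group("step") == m.group("total"):
        return float(m.group("loss")), float(m.group("reg")), float(m.group("cls"))
    return None

line = ("500/500 [==============================] - 195s 390ms/step - "
        "loss: 1.1807 - regression_loss: 1.0262 - classification_loss: 0.1545")
print(parse_epoch_summary(line))  # (1.1807, 1.0262, 0.1545)
```

Applied line-by-line over the whole log, this yields one loss triple per epoch, which is enough to plot the training curves without re-running the job.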
... (per-batch progress updates for steps 13-124 of epoch 7 omitted; total loss rose from ~0.85 to ~1.10 as the running average accumulated) ...
- ETA: 2:25 - loss: 1.0966 - regression_loss: 0.9573 - classification_loss: 0.1393 126/500 [======>.......................] - ETA: 2:24 - loss: 1.0980 - regression_loss: 0.9590 - classification_loss: 0.1389 127/500 [======>.......................] - ETA: 2:24 - loss: 1.1067 - regression_loss: 0.9664 - classification_loss: 0.1403 128/500 [======>.......................] - ETA: 2:24 - loss: 1.1075 - regression_loss: 0.9670 - classification_loss: 0.1406 129/500 [======>.......................] - ETA: 2:23 - loss: 1.1072 - regression_loss: 0.9668 - classification_loss: 0.1404 130/500 [======>.......................] - ETA: 2:23 - loss: 1.1034 - regression_loss: 0.9636 - classification_loss: 0.1397 131/500 [======>.......................] - ETA: 2:23 - loss: 1.1079 - regression_loss: 0.9674 - classification_loss: 0.1406 132/500 [======>.......................] - ETA: 2:22 - loss: 1.1107 - regression_loss: 0.9699 - classification_loss: 0.1408 133/500 [======>.......................] - ETA: 2:22 - loss: 1.1131 - regression_loss: 0.9720 - classification_loss: 0.1411 134/500 [=======>......................] - ETA: 2:21 - loss: 1.1157 - regression_loss: 0.9739 - classification_loss: 0.1418 135/500 [=======>......................] - ETA: 2:21 - loss: 1.1137 - regression_loss: 0.9722 - classification_loss: 0.1415 136/500 [=======>......................] - ETA: 2:21 - loss: 1.1095 - regression_loss: 0.9684 - classification_loss: 0.1411 137/500 [=======>......................] - ETA: 2:20 - loss: 1.1159 - regression_loss: 0.9725 - classification_loss: 0.1434 138/500 [=======>......................] - ETA: 2:20 - loss: 1.1140 - regression_loss: 0.9710 - classification_loss: 0.1430 139/500 [=======>......................] - ETA: 2:20 - loss: 1.1119 - regression_loss: 0.9693 - classification_loss: 0.1426 140/500 [=======>......................] - ETA: 2:19 - loss: 1.1111 - regression_loss: 0.9687 - classification_loss: 0.1423 141/500 [=======>......................] 
- ETA: 2:19 - loss: 1.1112 - regression_loss: 0.9688 - classification_loss: 0.1424 142/500 [=======>......................] - ETA: 2:18 - loss: 1.1153 - regression_loss: 0.9716 - classification_loss: 0.1436 143/500 [=======>......................] - ETA: 2:18 - loss: 1.1121 - regression_loss: 0.9691 - classification_loss: 0.1431 144/500 [=======>......................] - ETA: 2:18 - loss: 1.1112 - regression_loss: 0.9685 - classification_loss: 0.1427 145/500 [=======>......................] - ETA: 2:17 - loss: 1.1103 - regression_loss: 0.9678 - classification_loss: 0.1425 146/500 [=======>......................] - ETA: 2:17 - loss: 1.1080 - regression_loss: 0.9659 - classification_loss: 0.1421 147/500 [=======>......................] - ETA: 2:16 - loss: 1.1027 - regression_loss: 0.9612 - classification_loss: 0.1415 148/500 [=======>......................] - ETA: 2:16 - loss: 1.1020 - regression_loss: 0.9607 - classification_loss: 0.1413 149/500 [=======>......................] - ETA: 2:16 - loss: 1.1012 - regression_loss: 0.9600 - classification_loss: 0.1412 150/500 [========>.....................] - ETA: 2:15 - loss: 1.1003 - regression_loss: 0.9591 - classification_loss: 0.1412 151/500 [========>.....................] - ETA: 2:15 - loss: 1.1077 - regression_loss: 0.9652 - classification_loss: 0.1425 152/500 [========>.....................] - ETA: 2:15 - loss: 1.1061 - regression_loss: 0.9639 - classification_loss: 0.1423 153/500 [========>.....................] - ETA: 2:14 - loss: 1.1104 - regression_loss: 0.9674 - classification_loss: 0.1430 154/500 [========>.....................] - ETA: 2:14 - loss: 1.1115 - regression_loss: 0.9685 - classification_loss: 0.1430 155/500 [========>.....................] - ETA: 2:13 - loss: 1.1116 - regression_loss: 0.9688 - classification_loss: 0.1428 156/500 [========>.....................] - ETA: 2:13 - loss: 1.1117 - regression_loss: 0.9691 - classification_loss: 0.1427 157/500 [========>.....................] 
- ETA: 2:13 - loss: 1.1086 - regression_loss: 0.9662 - classification_loss: 0.1424 158/500 [========>.....................] - ETA: 2:12 - loss: 1.1072 - regression_loss: 0.9652 - classification_loss: 0.1421 159/500 [========>.....................] - ETA: 2:12 - loss: 1.1045 - regression_loss: 0.9627 - classification_loss: 0.1418 160/500 [========>.....................] - ETA: 2:11 - loss: 1.1047 - regression_loss: 0.9630 - classification_loss: 0.1417 161/500 [========>.....................] - ETA: 2:11 - loss: 1.1061 - regression_loss: 0.9641 - classification_loss: 0.1420 162/500 [========>.....................] - ETA: 2:11 - loss: 1.1025 - regression_loss: 0.9611 - classification_loss: 0.1414 163/500 [========>.....................] - ETA: 2:10 - loss: 1.1065 - regression_loss: 0.9646 - classification_loss: 0.1419 164/500 [========>.....................] - ETA: 2:10 - loss: 1.1078 - regression_loss: 0.9659 - classification_loss: 0.1419 165/500 [========>.....................] - ETA: 2:09 - loss: 1.1111 - regression_loss: 0.9685 - classification_loss: 0.1426 166/500 [========>.....................] - ETA: 2:09 - loss: 1.1080 - regression_loss: 0.9657 - classification_loss: 0.1422 167/500 [=========>....................] - ETA: 2:09 - loss: 1.1094 - regression_loss: 0.9669 - classification_loss: 0.1425 168/500 [=========>....................] - ETA: 2:08 - loss: 1.1078 - regression_loss: 0.9655 - classification_loss: 0.1423 169/500 [=========>....................] - ETA: 2:08 - loss: 1.1066 - regression_loss: 0.9644 - classification_loss: 0.1422 170/500 [=========>....................] - ETA: 2:07 - loss: 1.1091 - regression_loss: 0.9667 - classification_loss: 0.1424 171/500 [=========>....................] - ETA: 2:07 - loss: 1.1111 - regression_loss: 0.9687 - classification_loss: 0.1424 172/500 [=========>....................] - ETA: 2:07 - loss: 1.1141 - regression_loss: 0.9710 - classification_loss: 0.1431 173/500 [=========>....................] 
- ETA: 2:06 - loss: 1.1150 - regression_loss: 0.9718 - classification_loss: 0.1432 174/500 [=========>....................] - ETA: 2:06 - loss: 1.1127 - regression_loss: 0.9699 - classification_loss: 0.1428 175/500 [=========>....................] - ETA: 2:06 - loss: 1.1168 - regression_loss: 0.9734 - classification_loss: 0.1435 176/500 [=========>....................] - ETA: 2:05 - loss: 1.1148 - regression_loss: 0.9717 - classification_loss: 0.1431 177/500 [=========>....................] - ETA: 2:05 - loss: 1.1147 - regression_loss: 0.9718 - classification_loss: 0.1429 178/500 [=========>....................] - ETA: 2:04 - loss: 1.1143 - regression_loss: 0.9712 - classification_loss: 0.1430 179/500 [=========>....................] - ETA: 2:04 - loss: 1.1160 - regression_loss: 0.9728 - classification_loss: 0.1432 180/500 [=========>....................] - ETA: 2:04 - loss: 1.1146 - regression_loss: 0.9718 - classification_loss: 0.1428 181/500 [=========>....................] - ETA: 2:03 - loss: 1.1183 - regression_loss: 0.9751 - classification_loss: 0.1432 182/500 [=========>....................] - ETA: 2:03 - loss: 1.1161 - regression_loss: 0.9732 - classification_loss: 0.1429 183/500 [=========>....................] - ETA: 2:03 - loss: 1.1139 - regression_loss: 0.9713 - classification_loss: 0.1426 184/500 [==========>...................] - ETA: 2:02 - loss: 1.1145 - regression_loss: 0.9715 - classification_loss: 0.1430 185/500 [==========>...................] - ETA: 2:02 - loss: 1.1148 - regression_loss: 0.9719 - classification_loss: 0.1430 186/500 [==========>...................] - ETA: 2:01 - loss: 1.1148 - regression_loss: 0.9720 - classification_loss: 0.1428 187/500 [==========>...................] - ETA: 2:01 - loss: 1.1138 - regression_loss: 0.9712 - classification_loss: 0.1425 188/500 [==========>...................] - ETA: 2:01 - loss: 1.1137 - regression_loss: 0.9713 - classification_loss: 0.1425 189/500 [==========>...................] 
- ETA: 2:00 - loss: 1.1156 - regression_loss: 0.9728 - classification_loss: 0.1427 190/500 [==========>...................] - ETA: 2:00 - loss: 1.1163 - regression_loss: 0.9735 - classification_loss: 0.1428 191/500 [==========>...................] - ETA: 1:59 - loss: 1.1151 - regression_loss: 0.9725 - classification_loss: 0.1426 192/500 [==========>...................] - ETA: 1:59 - loss: 1.1141 - regression_loss: 0.9717 - classification_loss: 0.1424 193/500 [==========>...................] - ETA: 1:59 - loss: 1.1112 - regression_loss: 0.9693 - classification_loss: 0.1419 194/500 [==========>...................] - ETA: 1:58 - loss: 1.1110 - regression_loss: 0.9690 - classification_loss: 0.1421 195/500 [==========>...................] - ETA: 1:58 - loss: 1.1139 - regression_loss: 0.9710 - classification_loss: 0.1430 196/500 [==========>...................] - ETA: 1:57 - loss: 1.1147 - regression_loss: 0.9719 - classification_loss: 0.1428 197/500 [==========>...................] - ETA: 1:57 - loss: 1.1142 - regression_loss: 0.9716 - classification_loss: 0.1427 198/500 [==========>...................] - ETA: 1:57 - loss: 1.1156 - regression_loss: 0.9723 - classification_loss: 0.1433 199/500 [==========>...................] - ETA: 1:56 - loss: 1.1156 - regression_loss: 0.9722 - classification_loss: 0.1434 200/500 [===========>..................] - ETA: 1:56 - loss: 1.1143 - regression_loss: 0.9711 - classification_loss: 0.1432 201/500 [===========>..................] - ETA: 1:56 - loss: 1.1111 - regression_loss: 0.9683 - classification_loss: 0.1428 202/500 [===========>..................] - ETA: 1:55 - loss: 1.1074 - regression_loss: 0.9651 - classification_loss: 0.1423 203/500 [===========>..................] - ETA: 1:55 - loss: 1.1065 - regression_loss: 0.9646 - classification_loss: 0.1420 204/500 [===========>..................] - ETA: 1:54 - loss: 1.1066 - regression_loss: 0.9647 - classification_loss: 0.1419 205/500 [===========>..................] 
- ETA: 1:54 - loss: 1.1059 - regression_loss: 0.9642 - classification_loss: 0.1417 206/500 [===========>..................] - ETA: 1:54 - loss: 1.1052 - regression_loss: 0.9635 - classification_loss: 0.1417 207/500 [===========>..................] - ETA: 1:53 - loss: 1.1053 - regression_loss: 0.9634 - classification_loss: 0.1419 208/500 [===========>..................] - ETA: 1:53 - loss: 1.1048 - regression_loss: 0.9630 - classification_loss: 0.1417 209/500 [===========>..................] - ETA: 1:52 - loss: 1.1027 - regression_loss: 0.9609 - classification_loss: 0.1417 210/500 [===========>..................] - ETA: 1:52 - loss: 1.1035 - regression_loss: 0.9617 - classification_loss: 0.1418 211/500 [===========>..................] - ETA: 1:52 - loss: 1.1039 - regression_loss: 0.9624 - classification_loss: 0.1416 212/500 [===========>..................] - ETA: 1:51 - loss: 1.1043 - regression_loss: 0.9627 - classification_loss: 0.1415 213/500 [===========>..................] - ETA: 1:51 - loss: 1.1039 - regression_loss: 0.9625 - classification_loss: 0.1414 214/500 [===========>..................] - ETA: 1:51 - loss: 1.1046 - regression_loss: 0.9630 - classification_loss: 0.1416 215/500 [===========>..................] - ETA: 1:50 - loss: 1.1047 - regression_loss: 0.9634 - classification_loss: 0.1413 216/500 [===========>..................] - ETA: 1:50 - loss: 1.1033 - regression_loss: 0.9621 - classification_loss: 0.1413 217/500 [============>.................] - ETA: 1:49 - loss: 1.1059 - regression_loss: 0.9642 - classification_loss: 0.1417 218/500 [============>.................] - ETA: 1:49 - loss: 1.1067 - regression_loss: 0.9649 - classification_loss: 0.1418 219/500 [============>.................] - ETA: 1:49 - loss: 1.1064 - regression_loss: 0.9647 - classification_loss: 0.1417 220/500 [============>.................] - ETA: 1:48 - loss: 1.1047 - regression_loss: 0.9631 - classification_loss: 0.1415 221/500 [============>.................] 
- ETA: 1:48 - loss: 1.1074 - regression_loss: 0.9656 - classification_loss: 0.1419 222/500 [============>.................] - ETA: 1:47 - loss: 1.1057 - regression_loss: 0.9640 - classification_loss: 0.1417 223/500 [============>.................] - ETA: 1:47 - loss: 1.1044 - regression_loss: 0.9629 - classification_loss: 0.1415 224/500 [============>.................] - ETA: 1:47 - loss: 1.1037 - regression_loss: 0.9625 - classification_loss: 0.1412 225/500 [============>.................] - ETA: 1:46 - loss: 1.1040 - regression_loss: 0.9629 - classification_loss: 0.1412 226/500 [============>.................] - ETA: 1:46 - loss: 1.1012 - regression_loss: 0.9603 - classification_loss: 0.1408 227/500 [============>.................] - ETA: 1:46 - loss: 1.0997 - regression_loss: 0.9590 - classification_loss: 0.1407 228/500 [============>.................] - ETA: 1:45 - loss: 1.0999 - regression_loss: 0.9591 - classification_loss: 0.1408 229/500 [============>.................] - ETA: 1:45 - loss: 1.1016 - regression_loss: 0.9604 - classification_loss: 0.1412 230/500 [============>.................] - ETA: 1:44 - loss: 1.0989 - regression_loss: 0.9578 - classification_loss: 0.1411 231/500 [============>.................] - ETA: 1:44 - loss: 1.0979 - regression_loss: 0.9570 - classification_loss: 0.1409 232/500 [============>.................] - ETA: 1:44 - loss: 1.0988 - regression_loss: 0.9576 - classification_loss: 0.1412 233/500 [============>.................] - ETA: 1:43 - loss: 1.0976 - regression_loss: 0.9565 - classification_loss: 0.1411 234/500 [=============>................] - ETA: 1:43 - loss: 1.0969 - regression_loss: 0.9559 - classification_loss: 0.1410 235/500 [=============>................] - ETA: 1:42 - loss: 1.0982 - regression_loss: 0.9567 - classification_loss: 0.1416 236/500 [=============>................] - ETA: 1:42 - loss: 1.0964 - regression_loss: 0.9549 - classification_loss: 0.1414 237/500 [=============>................] 
- ETA: 1:42 - loss: 1.0962 - regression_loss: 0.9547 - classification_loss: 0.1414 238/500 [=============>................] - ETA: 1:41 - loss: 1.0953 - regression_loss: 0.9540 - classification_loss: 0.1413 239/500 [=============>................] - ETA: 1:41 - loss: 1.0943 - regression_loss: 0.9531 - classification_loss: 0.1412 240/500 [=============>................] - ETA: 1:40 - loss: 1.0963 - regression_loss: 0.9548 - classification_loss: 0.1415 241/500 [=============>................] - ETA: 1:40 - loss: 1.0966 - regression_loss: 0.9548 - classification_loss: 0.1417 242/500 [=============>................] - ETA: 1:40 - loss: 1.0933 - regression_loss: 0.9520 - classification_loss: 0.1413 243/500 [=============>................] - ETA: 1:39 - loss: 1.0901 - regression_loss: 0.9491 - classification_loss: 0.1410 244/500 [=============>................] - ETA: 1:39 - loss: 1.0891 - regression_loss: 0.9482 - classification_loss: 0.1408 245/500 [=============>................] - ETA: 1:39 - loss: 1.0905 - regression_loss: 0.9494 - classification_loss: 0.1411 246/500 [=============>................] - ETA: 1:38 - loss: 1.0892 - regression_loss: 0.9482 - classification_loss: 0.1410 247/500 [=============>................] - ETA: 1:38 - loss: 1.0887 - regression_loss: 0.9479 - classification_loss: 0.1408 248/500 [=============>................] - ETA: 1:37 - loss: 1.0881 - regression_loss: 0.9474 - classification_loss: 0.1407 249/500 [=============>................] - ETA: 1:37 - loss: 1.0862 - regression_loss: 0.9459 - classification_loss: 0.1403 250/500 [==============>...............] - ETA: 1:37 - loss: 1.0862 - regression_loss: 0.9459 - classification_loss: 0.1403 251/500 [==============>...............] - ETA: 1:36 - loss: 1.0849 - regression_loss: 0.9448 - classification_loss: 0.1400 252/500 [==============>...............] - ETA: 1:36 - loss: 1.0824 - regression_loss: 0.9427 - classification_loss: 0.1397 253/500 [==============>...............] 
- ETA: 1:35 - loss: 1.0806 - regression_loss: 0.9411 - classification_loss: 0.1395 254/500 [==============>...............] - ETA: 1:35 - loss: 1.0816 - regression_loss: 0.9418 - classification_loss: 0.1398 255/500 [==============>...............] - ETA: 1:35 - loss: 1.0819 - regression_loss: 0.9420 - classification_loss: 0.1399 256/500 [==============>...............] - ETA: 1:34 - loss: 1.0847 - regression_loss: 0.9439 - classification_loss: 0.1409 257/500 [==============>...............] - ETA: 1:34 - loss: 1.0852 - regression_loss: 0.9442 - classification_loss: 0.1411 258/500 [==============>...............] - ETA: 1:34 - loss: 1.0848 - regression_loss: 0.9439 - classification_loss: 0.1409 259/500 [==============>...............] - ETA: 1:33 - loss: 1.0824 - regression_loss: 0.9417 - classification_loss: 0.1407 260/500 [==============>...............] - ETA: 1:33 - loss: 1.0810 - regression_loss: 0.9405 - classification_loss: 0.1405 261/500 [==============>...............] - ETA: 1:32 - loss: 1.0801 - regression_loss: 0.9397 - classification_loss: 0.1404 262/500 [==============>...............] - ETA: 1:32 - loss: 1.0807 - regression_loss: 0.9402 - classification_loss: 0.1405 263/500 [==============>...............] - ETA: 1:32 - loss: 1.0814 - regression_loss: 0.9408 - classification_loss: 0.1406 264/500 [==============>...............] - ETA: 1:31 - loss: 1.0813 - regression_loss: 0.9408 - classification_loss: 0.1405 265/500 [==============>...............] - ETA: 1:31 - loss: 1.0821 - regression_loss: 0.9415 - classification_loss: 0.1406 266/500 [==============>...............] - ETA: 1:30 - loss: 1.0818 - regression_loss: 0.9414 - classification_loss: 0.1405 267/500 [===============>..............] - ETA: 1:30 - loss: 1.0820 - regression_loss: 0.9417 - classification_loss: 0.1403 268/500 [===============>..............] - ETA: 1:30 - loss: 1.0803 - regression_loss: 0.9402 - classification_loss: 0.1401 269/500 [===============>..............] 
- ETA: 1:29 - loss: 1.0797 - regression_loss: 0.9397 - classification_loss: 0.1400 270/500 [===============>..............] - ETA: 1:29 - loss: 1.0786 - regression_loss: 0.9388 - classification_loss: 0.1398 271/500 [===============>..............] - ETA: 1:29 - loss: 1.0826 - regression_loss: 0.9418 - classification_loss: 0.1408 272/500 [===============>..............] - ETA: 1:28 - loss: 1.0843 - regression_loss: 0.9433 - classification_loss: 0.1410 273/500 [===============>..............] - ETA: 1:28 - loss: 1.0843 - regression_loss: 0.9435 - classification_loss: 0.1409 274/500 [===============>..............] - ETA: 1:27 - loss: 1.0834 - regression_loss: 0.9428 - classification_loss: 0.1406 275/500 [===============>..............] - ETA: 1:27 - loss: 1.0829 - regression_loss: 0.9426 - classification_loss: 0.1403 276/500 [===============>..............] - ETA: 1:27 - loss: 1.0813 - regression_loss: 0.9413 - classification_loss: 0.1400 277/500 [===============>..............] - ETA: 1:26 - loss: 1.0843 - regression_loss: 0.9438 - classification_loss: 0.1405 278/500 [===============>..............] - ETA: 1:26 - loss: 1.0864 - regression_loss: 0.9455 - classification_loss: 0.1409 279/500 [===============>..............] - ETA: 1:25 - loss: 1.0872 - regression_loss: 0.9463 - classification_loss: 0.1409 280/500 [===============>..............] - ETA: 1:25 - loss: 1.0883 - regression_loss: 0.9474 - classification_loss: 0.1409 281/500 [===============>..............] - ETA: 1:25 - loss: 1.0874 - regression_loss: 0.9466 - classification_loss: 0.1408 282/500 [===============>..............] - ETA: 1:24 - loss: 1.0876 - regression_loss: 0.9468 - classification_loss: 0.1407 283/500 [===============>..............] - ETA: 1:24 - loss: 1.0877 - regression_loss: 0.9471 - classification_loss: 0.1406 284/500 [================>.............] - ETA: 1:24 - loss: 1.0871 - regression_loss: 0.9467 - classification_loss: 0.1404 285/500 [================>.............] 
- ETA: 1:23 - loss: 1.0863 - regression_loss: 0.9459 - classification_loss: 0.1404 286/500 [================>.............] - ETA: 1:23 - loss: 1.0879 - regression_loss: 0.9473 - classification_loss: 0.1406 287/500 [================>.............] - ETA: 1:22 - loss: 1.0889 - regression_loss: 0.9482 - classification_loss: 0.1407 288/500 [================>.............] - ETA: 1:22 - loss: 1.0890 - regression_loss: 0.9485 - classification_loss: 0.1405 289/500 [================>.............] - ETA: 1:22 - loss: 1.0883 - regression_loss: 0.9481 - classification_loss: 0.1402 290/500 [================>.............] - ETA: 1:21 - loss: 1.0884 - regression_loss: 0.9483 - classification_loss: 0.1400 291/500 [================>.............] - ETA: 1:21 - loss: 1.0876 - regression_loss: 0.9477 - classification_loss: 0.1399 292/500 [================>.............] - ETA: 1:20 - loss: 1.0879 - regression_loss: 0.9481 - classification_loss: 0.1398 293/500 [================>.............] - ETA: 1:20 - loss: 1.0885 - regression_loss: 0.9485 - classification_loss: 0.1400 294/500 [================>.............] - ETA: 1:20 - loss: 1.0865 - regression_loss: 0.9467 - classification_loss: 0.1398 295/500 [================>.............] - ETA: 1:19 - loss: 1.0858 - regression_loss: 0.9461 - classification_loss: 0.1398 296/500 [================>.............] - ETA: 1:19 - loss: 1.0868 - regression_loss: 0.9467 - classification_loss: 0.1401 297/500 [================>.............] - ETA: 1:19 - loss: 1.0852 - regression_loss: 0.9454 - classification_loss: 0.1398 298/500 [================>.............] - ETA: 1:18 - loss: 1.0867 - regression_loss: 0.9465 - classification_loss: 0.1402 299/500 [================>.............] - ETA: 1:18 - loss: 1.0861 - regression_loss: 0.9460 - classification_loss: 0.1401 300/500 [=================>............] - ETA: 1:17 - loss: 1.0849 - regression_loss: 0.9449 - classification_loss: 0.1400 301/500 [=================>............] 
- ETA: 1:17 - loss: 1.0833 - regression_loss: 0.9436 - classification_loss: 0.1397 302/500 [=================>............] - ETA: 1:17 - loss: 1.0831 - regression_loss: 0.9437 - classification_loss: 0.1395 303/500 [=================>............] - ETA: 1:16 - loss: 1.0846 - regression_loss: 0.9446 - classification_loss: 0.1399 304/500 [=================>............] - ETA: 1:16 - loss: 1.0837 - regression_loss: 0.9438 - classification_loss: 0.1399 305/500 [=================>............] - ETA: 1:15 - loss: 1.0834 - regression_loss: 0.9436 - classification_loss: 0.1398 306/500 [=================>............] - ETA: 1:15 - loss: 1.0841 - regression_loss: 0.9443 - classification_loss: 0.1398 307/500 [=================>............] - ETA: 1:15 - loss: 1.0858 - regression_loss: 0.9459 - classification_loss: 0.1399 308/500 [=================>............] - ETA: 1:14 - loss: 1.0851 - regression_loss: 0.9454 - classification_loss: 0.1398 309/500 [=================>............] - ETA: 1:14 - loss: 1.0869 - regression_loss: 0.9471 - classification_loss: 0.1398 310/500 [=================>............] - ETA: 1:13 - loss: 1.0861 - regression_loss: 0.9464 - classification_loss: 0.1396 311/500 [=================>............] - ETA: 1:13 - loss: 1.0852 - regression_loss: 0.9457 - classification_loss: 0.1394 312/500 [=================>............] - ETA: 1:13 - loss: 1.0839 - regression_loss: 0.9445 - classification_loss: 0.1394 313/500 [=================>............] - ETA: 1:12 - loss: 1.0847 - regression_loss: 0.9452 - classification_loss: 0.1395 314/500 [=================>............] - ETA: 1:12 - loss: 1.0853 - regression_loss: 0.9458 - classification_loss: 0.1396 315/500 [=================>............] - ETA: 1:12 - loss: 1.0876 - regression_loss: 0.9476 - classification_loss: 0.1400 316/500 [=================>............] - ETA: 1:11 - loss: 1.0881 - regression_loss: 0.9480 - classification_loss: 0.1401 317/500 [==================>...........] 
- ETA: 1:11 - loss: 1.0901 - regression_loss: 0.9497 - classification_loss: 0.1403 318/500 [==================>...........] - ETA: 1:10 - loss: 1.0907 - regression_loss: 0.9503 - classification_loss: 0.1403 319/500 [==================>...........] - ETA: 1:10 - loss: 1.0888 - regression_loss: 0.9488 - classification_loss: 0.1401 320/500 [==================>...........] - ETA: 1:10 - loss: 1.0895 - regression_loss: 0.9494 - classification_loss: 0.1400 321/500 [==================>...........] - ETA: 1:09 - loss: 1.0916 - regression_loss: 0.9512 - classification_loss: 0.1404 322/500 [==================>...........] - ETA: 1:09 - loss: 1.0911 - regression_loss: 0.9508 - classification_loss: 0.1403 323/500 [==================>...........] - ETA: 1:08 - loss: 1.0899 - regression_loss: 0.9500 - classification_loss: 0.1400 324/500 [==================>...........] - ETA: 1:08 - loss: 1.0899 - regression_loss: 0.9499 - classification_loss: 0.1400 325/500 [==================>...........] - ETA: 1:08 - loss: 1.0896 - regression_loss: 0.9498 - classification_loss: 0.1399 326/500 [==================>...........] - ETA: 1:07 - loss: 1.0900 - regression_loss: 0.9501 - classification_loss: 0.1399 327/500 [==================>...........] - ETA: 1:07 - loss: 1.0924 - regression_loss: 0.9519 - classification_loss: 0.1405 328/500 [==================>...........] - ETA: 1:07 - loss: 1.0933 - regression_loss: 0.9527 - classification_loss: 0.1406 329/500 [==================>...........] - ETA: 1:06 - loss: 1.0927 - regression_loss: 0.9522 - classification_loss: 0.1405 330/500 [==================>...........] - ETA: 1:06 - loss: 1.0928 - regression_loss: 0.9523 - classification_loss: 0.1405 331/500 [==================>...........] - ETA: 1:05 - loss: 1.0926 - regression_loss: 0.9523 - classification_loss: 0.1404 332/500 [==================>...........] - ETA: 1:05 - loss: 1.0918 - regression_loss: 0.9516 - classification_loss: 0.1403 333/500 [==================>...........] 
500/500 [==============================] - 195s 389ms/step - loss: 1.0756 - regression_loss: 0.9369 - classification_loss: 0.1387
326 instances of class plum with average precision: 0.8220
mAP: 0.8220
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/20
167/500 [=========>....................] - ETA: 2:10 - loss: 0.9783 - regression_loss: 0.8576 - classification_loss: 0.1207
- ETA: 2:09 - loss: 0.9817 - regression_loss: 0.8606 - classification_loss: 0.1212 169/500 [=========>....................] - ETA: 2:09 - loss: 0.9842 - regression_loss: 0.8627 - classification_loss: 0.1215 170/500 [=========>....................] - ETA: 2:09 - loss: 0.9846 - regression_loss: 0.8630 - classification_loss: 0.1215 171/500 [=========>....................] - ETA: 2:08 - loss: 0.9837 - regression_loss: 0.8625 - classification_loss: 0.1212 172/500 [=========>....................] - ETA: 2:08 - loss: 0.9819 - regression_loss: 0.8610 - classification_loss: 0.1209 173/500 [=========>....................] - ETA: 2:07 - loss: 0.9818 - regression_loss: 0.8612 - classification_loss: 0.1206 174/500 [=========>....................] - ETA: 2:07 - loss: 0.9801 - regression_loss: 0.8598 - classification_loss: 0.1203 175/500 [=========>....................] - ETA: 2:07 - loss: 0.9776 - regression_loss: 0.8574 - classification_loss: 0.1202 176/500 [=========>....................] - ETA: 2:06 - loss: 0.9777 - regression_loss: 0.8575 - classification_loss: 0.1202 177/500 [=========>....................] - ETA: 2:06 - loss: 0.9776 - regression_loss: 0.8575 - classification_loss: 0.1201 178/500 [=========>....................] - ETA: 2:05 - loss: 0.9788 - regression_loss: 0.8585 - classification_loss: 0.1202 179/500 [=========>....................] - ETA: 2:05 - loss: 0.9843 - regression_loss: 0.8628 - classification_loss: 0.1214 180/500 [=========>....................] - ETA: 2:05 - loss: 0.9839 - regression_loss: 0.8625 - classification_loss: 0.1214 181/500 [=========>....................] - ETA: 2:04 - loss: 0.9849 - regression_loss: 0.8635 - classification_loss: 0.1214 182/500 [=========>....................] - ETA: 2:04 - loss: 0.9820 - regression_loss: 0.8610 - classification_loss: 0.1210 183/500 [=========>....................] - ETA: 2:03 - loss: 0.9819 - regression_loss: 0.8610 - classification_loss: 0.1210 184/500 [==========>...................] 
- ETA: 2:03 - loss: 0.9882 - regression_loss: 0.8653 - classification_loss: 0.1229 185/500 [==========>...................] - ETA: 2:03 - loss: 0.9878 - regression_loss: 0.8651 - classification_loss: 0.1227 186/500 [==========>...................] - ETA: 2:02 - loss: 0.9947 - regression_loss: 0.8706 - classification_loss: 0.1241 187/500 [==========>...................] - ETA: 2:02 - loss: 0.9940 - regression_loss: 0.8701 - classification_loss: 0.1239 188/500 [==========>...................] - ETA: 2:01 - loss: 0.9938 - regression_loss: 0.8701 - classification_loss: 0.1237 189/500 [==========>...................] - ETA: 2:01 - loss: 0.9941 - regression_loss: 0.8704 - classification_loss: 0.1237 190/500 [==========>...................] - ETA: 2:01 - loss: 0.9923 - regression_loss: 0.8688 - classification_loss: 0.1235 191/500 [==========>...................] - ETA: 2:00 - loss: 0.9964 - regression_loss: 0.8722 - classification_loss: 0.1242 192/500 [==========>...................] - ETA: 2:00 - loss: 0.9971 - regression_loss: 0.8728 - classification_loss: 0.1243 193/500 [==========>...................] - ETA: 1:59 - loss: 0.9957 - regression_loss: 0.8715 - classification_loss: 0.1242 194/500 [==========>...................] - ETA: 1:59 - loss: 0.9938 - regression_loss: 0.8699 - classification_loss: 0.1239 195/500 [==========>...................] - ETA: 1:59 - loss: 0.9931 - regression_loss: 0.8693 - classification_loss: 0.1237 196/500 [==========>...................] - ETA: 1:58 - loss: 0.9917 - regression_loss: 0.8682 - classification_loss: 0.1234 197/500 [==========>...................] - ETA: 1:58 - loss: 0.9910 - regression_loss: 0.8678 - classification_loss: 0.1232 198/500 [==========>...................] - ETA: 1:57 - loss: 0.9910 - regression_loss: 0.8678 - classification_loss: 0.1232 199/500 [==========>...................] - ETA: 1:57 - loss: 0.9886 - regression_loss: 0.8658 - classification_loss: 0.1228 200/500 [===========>..................] 
- ETA: 1:57 - loss: 0.9886 - regression_loss: 0.8659 - classification_loss: 0.1227 201/500 [===========>..................] - ETA: 1:56 - loss: 0.9883 - regression_loss: 0.8658 - classification_loss: 0.1225 202/500 [===========>..................] - ETA: 1:56 - loss: 0.9876 - regression_loss: 0.8650 - classification_loss: 0.1225 203/500 [===========>..................] - ETA: 1:55 - loss: 0.9879 - regression_loss: 0.8653 - classification_loss: 0.1226 204/500 [===========>..................] - ETA: 1:55 - loss: 0.9878 - regression_loss: 0.8654 - classification_loss: 0.1223 205/500 [===========>..................] - ETA: 1:55 - loss: 0.9860 - regression_loss: 0.8639 - classification_loss: 0.1220 206/500 [===========>..................] - ETA: 1:54 - loss: 0.9847 - regression_loss: 0.8630 - classification_loss: 0.1217 207/500 [===========>..................] - ETA: 1:54 - loss: 0.9861 - regression_loss: 0.8644 - classification_loss: 0.1217 208/500 [===========>..................] - ETA: 1:53 - loss: 0.9899 - regression_loss: 0.8674 - classification_loss: 0.1225 209/500 [===========>..................] - ETA: 1:53 - loss: 0.9897 - regression_loss: 0.8674 - classification_loss: 0.1223 210/500 [===========>..................] - ETA: 1:53 - loss: 0.9881 - regression_loss: 0.8660 - classification_loss: 0.1221 211/500 [===========>..................] - ETA: 1:52 - loss: 0.9869 - regression_loss: 0.8651 - classification_loss: 0.1218 212/500 [===========>..................] - ETA: 1:52 - loss: 0.9870 - regression_loss: 0.8652 - classification_loss: 0.1219 213/500 [===========>..................] - ETA: 1:51 - loss: 0.9889 - regression_loss: 0.8668 - classification_loss: 0.1221 214/500 [===========>..................] - ETA: 1:51 - loss: 0.9872 - regression_loss: 0.8653 - classification_loss: 0.1219 215/500 [===========>..................] - ETA: 1:51 - loss: 0.9874 - regression_loss: 0.8654 - classification_loss: 0.1220 216/500 [===========>..................] 
- ETA: 1:50 - loss: 0.9866 - regression_loss: 0.8646 - classification_loss: 0.1219 217/500 [============>.................] - ETA: 1:50 - loss: 0.9844 - regression_loss: 0.8627 - classification_loss: 0.1218 218/500 [============>.................] - ETA: 1:50 - loss: 0.9829 - regression_loss: 0.8615 - classification_loss: 0.1215 219/500 [============>.................] - ETA: 1:49 - loss: 0.9829 - regression_loss: 0.8614 - classification_loss: 0.1214 220/500 [============>.................] - ETA: 1:49 - loss: 0.9810 - regression_loss: 0.8597 - classification_loss: 0.1212 221/500 [============>.................] - ETA: 1:48 - loss: 0.9814 - regression_loss: 0.8598 - classification_loss: 0.1215 222/500 [============>.................] - ETA: 1:48 - loss: 0.9818 - regression_loss: 0.8603 - classification_loss: 0.1215 223/500 [============>.................] - ETA: 1:48 - loss: 0.9806 - regression_loss: 0.8591 - classification_loss: 0.1215 224/500 [============>.................] - ETA: 1:47 - loss: 0.9801 - regression_loss: 0.8588 - classification_loss: 0.1213 225/500 [============>.................] - ETA: 1:47 - loss: 0.9801 - regression_loss: 0.8586 - classification_loss: 0.1215 226/500 [============>.................] - ETA: 1:46 - loss: 0.9791 - regression_loss: 0.8578 - classification_loss: 0.1213 227/500 [============>.................] - ETA: 1:46 - loss: 0.9760 - regression_loss: 0.8551 - classification_loss: 0.1209 228/500 [============>.................] - ETA: 1:46 - loss: 0.9795 - regression_loss: 0.8579 - classification_loss: 0.1215 229/500 [============>.................] - ETA: 1:45 - loss: 0.9767 - regression_loss: 0.8554 - classification_loss: 0.1212 230/500 [============>.................] - ETA: 1:45 - loss: 0.9779 - regression_loss: 0.8564 - classification_loss: 0.1215 231/500 [============>.................] - ETA: 1:44 - loss: 0.9762 - regression_loss: 0.8550 - classification_loss: 0.1212 232/500 [============>.................] 
- ETA: 1:44 - loss: 0.9762 - regression_loss: 0.8550 - classification_loss: 0.1211 233/500 [============>.................] - ETA: 1:44 - loss: 0.9749 - regression_loss: 0.8540 - classification_loss: 0.1209 234/500 [=============>................] - ETA: 1:43 - loss: 0.9746 - regression_loss: 0.8538 - classification_loss: 0.1208 235/500 [=============>................] - ETA: 1:43 - loss: 0.9747 - regression_loss: 0.8540 - classification_loss: 0.1207 236/500 [=============>................] - ETA: 1:43 - loss: 0.9752 - regression_loss: 0.8545 - classification_loss: 0.1208 237/500 [=============>................] - ETA: 1:42 - loss: 0.9745 - regression_loss: 0.8539 - classification_loss: 0.1206 238/500 [=============>................] - ETA: 1:42 - loss: 0.9741 - regression_loss: 0.8534 - classification_loss: 0.1207 239/500 [=============>................] - ETA: 1:41 - loss: 0.9739 - regression_loss: 0.8533 - classification_loss: 0.1206 240/500 [=============>................] - ETA: 1:41 - loss: 0.9729 - regression_loss: 0.8524 - classification_loss: 0.1205 241/500 [=============>................] - ETA: 1:41 - loss: 0.9743 - regression_loss: 0.8535 - classification_loss: 0.1208 242/500 [=============>................] - ETA: 1:40 - loss: 0.9754 - regression_loss: 0.8542 - classification_loss: 0.1212 243/500 [=============>................] - ETA: 1:40 - loss: 0.9766 - regression_loss: 0.8555 - classification_loss: 0.1211 244/500 [=============>................] - ETA: 1:39 - loss: 0.9747 - regression_loss: 0.8536 - classification_loss: 0.1211 245/500 [=============>................] - ETA: 1:39 - loss: 0.9752 - regression_loss: 0.8540 - classification_loss: 0.1213 246/500 [=============>................] - ETA: 1:39 - loss: 0.9734 - regression_loss: 0.8525 - classification_loss: 0.1209 247/500 [=============>................] - ETA: 1:38 - loss: 0.9739 - regression_loss: 0.8533 - classification_loss: 0.1207 248/500 [=============>................] 
- ETA: 1:38 - loss: 0.9724 - regression_loss: 0.8520 - classification_loss: 0.1204 249/500 [=============>................] - ETA: 1:37 - loss: 0.9734 - regression_loss: 0.8529 - classification_loss: 0.1204 250/500 [==============>...............] - ETA: 1:37 - loss: 0.9724 - regression_loss: 0.8521 - classification_loss: 0.1203 251/500 [==============>...............] - ETA: 1:37 - loss: 0.9714 - regression_loss: 0.8512 - classification_loss: 0.1202 252/500 [==============>...............] - ETA: 1:36 - loss: 0.9695 - regression_loss: 0.8496 - classification_loss: 0.1199 253/500 [==============>...............] - ETA: 1:36 - loss: 0.9685 - regression_loss: 0.8487 - classification_loss: 0.1198 254/500 [==============>...............] - ETA: 1:36 - loss: 0.9709 - regression_loss: 0.8507 - classification_loss: 0.1202 255/500 [==============>...............] - ETA: 1:35 - loss: 0.9708 - regression_loss: 0.8507 - classification_loss: 0.1201 256/500 [==============>...............] - ETA: 1:35 - loss: 0.9710 - regression_loss: 0.8509 - classification_loss: 0.1201 257/500 [==============>...............] - ETA: 1:34 - loss: 0.9699 - regression_loss: 0.8499 - classification_loss: 0.1200 258/500 [==============>...............] - ETA: 1:34 - loss: 0.9699 - regression_loss: 0.8500 - classification_loss: 0.1199 259/500 [==============>...............] - ETA: 1:34 - loss: 0.9702 - regression_loss: 0.8505 - classification_loss: 0.1198 260/500 [==============>...............] - ETA: 1:33 - loss: 0.9743 - regression_loss: 0.8537 - classification_loss: 0.1205 261/500 [==============>...............] - ETA: 1:33 - loss: 0.9754 - regression_loss: 0.8549 - classification_loss: 0.1205 262/500 [==============>...............] - ETA: 1:32 - loss: 0.9755 - regression_loss: 0.8549 - classification_loss: 0.1206 263/500 [==============>...............] - ETA: 1:32 - loss: 0.9754 - regression_loss: 0.8548 - classification_loss: 0.1205 264/500 [==============>...............] 
- ETA: 1:32 - loss: 0.9747 - regression_loss: 0.8542 - classification_loss: 0.1205 265/500 [==============>...............] - ETA: 1:31 - loss: 0.9736 - regression_loss: 0.8533 - classification_loss: 0.1203 266/500 [==============>...............] - ETA: 1:31 - loss: 0.9724 - regression_loss: 0.8520 - classification_loss: 0.1204 267/500 [===============>..............] - ETA: 1:30 - loss: 0.9720 - regression_loss: 0.8516 - classification_loss: 0.1204 268/500 [===============>..............] - ETA: 1:30 - loss: 0.9727 - regression_loss: 0.8520 - classification_loss: 0.1206 269/500 [===============>..............] - ETA: 1:30 - loss: 0.9740 - regression_loss: 0.8532 - classification_loss: 0.1209 270/500 [===============>..............] - ETA: 1:29 - loss: 0.9730 - regression_loss: 0.8522 - classification_loss: 0.1208 271/500 [===============>..............] - ETA: 1:29 - loss: 0.9722 - regression_loss: 0.8516 - classification_loss: 0.1206 272/500 [===============>..............] - ETA: 1:28 - loss: 0.9708 - regression_loss: 0.8504 - classification_loss: 0.1203 273/500 [===============>..............] - ETA: 1:28 - loss: 0.9703 - regression_loss: 0.8500 - classification_loss: 0.1202 274/500 [===============>..............] - ETA: 1:28 - loss: 0.9688 - regression_loss: 0.8489 - classification_loss: 0.1200 275/500 [===============>..............] - ETA: 1:27 - loss: 0.9690 - regression_loss: 0.8490 - classification_loss: 0.1199 276/500 [===============>..............] - ETA: 1:27 - loss: 0.9681 - regression_loss: 0.8484 - classification_loss: 0.1197 277/500 [===============>..............] - ETA: 1:26 - loss: 0.9685 - regression_loss: 0.8488 - classification_loss: 0.1197 278/500 [===============>..............] - ETA: 1:26 - loss: 0.9679 - regression_loss: 0.8484 - classification_loss: 0.1195 279/500 [===============>..............] - ETA: 1:26 - loss: 0.9688 - regression_loss: 0.8492 - classification_loss: 0.1196 280/500 [===============>..............] 
- ETA: 1:25 - loss: 0.9715 - regression_loss: 0.8512 - classification_loss: 0.1203 281/500 [===============>..............] - ETA: 1:25 - loss: 0.9715 - regression_loss: 0.8512 - classification_loss: 0.1203 282/500 [===============>..............] - ETA: 1:24 - loss: 0.9723 - regression_loss: 0.8518 - classification_loss: 0.1205 283/500 [===============>..............] - ETA: 1:24 - loss: 0.9716 - regression_loss: 0.8511 - classification_loss: 0.1204 284/500 [================>.............] - ETA: 1:24 - loss: 0.9724 - regression_loss: 0.8518 - classification_loss: 0.1206 285/500 [================>.............] - ETA: 1:23 - loss: 0.9714 - regression_loss: 0.8510 - classification_loss: 0.1204 286/500 [================>.............] - ETA: 1:23 - loss: 0.9700 - regression_loss: 0.8496 - classification_loss: 0.1205 287/500 [================>.............] - ETA: 1:22 - loss: 0.9675 - regression_loss: 0.8473 - classification_loss: 0.1202 288/500 [================>.............] - ETA: 1:22 - loss: 0.9691 - regression_loss: 0.8487 - classification_loss: 0.1204 289/500 [================>.............] - ETA: 1:22 - loss: 0.9687 - regression_loss: 0.8484 - classification_loss: 0.1203 290/500 [================>.............] - ETA: 1:21 - loss: 0.9673 - regression_loss: 0.8472 - classification_loss: 0.1202 291/500 [================>.............] - ETA: 1:21 - loss: 0.9673 - regression_loss: 0.8471 - classification_loss: 0.1201 292/500 [================>.............] - ETA: 1:21 - loss: 0.9680 - regression_loss: 0.8479 - classification_loss: 0.1201 293/500 [================>.............] - ETA: 1:20 - loss: 0.9675 - regression_loss: 0.8475 - classification_loss: 0.1200 294/500 [================>.............] - ETA: 1:20 - loss: 0.9672 - regression_loss: 0.8473 - classification_loss: 0.1199 295/500 [================>.............] - ETA: 1:19 - loss: 0.9689 - regression_loss: 0.8487 - classification_loss: 0.1202 296/500 [================>.............] 
- ETA: 1:19 - loss: 0.9707 - regression_loss: 0.8499 - classification_loss: 0.1208 297/500 [================>.............] - ETA: 1:19 - loss: 0.9694 - regression_loss: 0.8487 - classification_loss: 0.1207 298/500 [================>.............] - ETA: 1:18 - loss: 0.9701 - regression_loss: 0.8492 - classification_loss: 0.1210 299/500 [================>.............] - ETA: 1:18 - loss: 0.9701 - regression_loss: 0.8491 - classification_loss: 0.1210 300/500 [=================>............] - ETA: 1:17 - loss: 0.9687 - regression_loss: 0.8479 - classification_loss: 0.1208 301/500 [=================>............] - ETA: 1:17 - loss: 0.9683 - regression_loss: 0.8476 - classification_loss: 0.1207 302/500 [=================>............] - ETA: 1:17 - loss: 0.9681 - regression_loss: 0.8475 - classification_loss: 0.1206 303/500 [=================>............] - ETA: 1:16 - loss: 0.9674 - regression_loss: 0.8468 - classification_loss: 0.1206 304/500 [=================>............] - ETA: 1:16 - loss: 0.9657 - regression_loss: 0.8453 - classification_loss: 0.1204 305/500 [=================>............] - ETA: 1:15 - loss: 0.9634 - regression_loss: 0.8433 - classification_loss: 0.1201 306/500 [=================>............] - ETA: 1:15 - loss: 0.9622 - regression_loss: 0.8422 - classification_loss: 0.1200 307/500 [=================>............] - ETA: 1:15 - loss: 0.9612 - regression_loss: 0.8413 - classification_loss: 0.1198 308/500 [=================>............] - ETA: 1:14 - loss: 0.9616 - regression_loss: 0.8419 - classification_loss: 0.1197 309/500 [=================>............] - ETA: 1:14 - loss: 0.9615 - regression_loss: 0.8418 - classification_loss: 0.1196 310/500 [=================>............] - ETA: 1:14 - loss: 0.9607 - regression_loss: 0.8411 - classification_loss: 0.1196 311/500 [=================>............] - ETA: 1:13 - loss: 0.9602 - regression_loss: 0.8406 - classification_loss: 0.1196 312/500 [=================>............] 
- ETA: 1:13 - loss: 0.9611 - regression_loss: 0.8415 - classification_loss: 0.1195 313/500 [=================>............] - ETA: 1:12 - loss: 0.9613 - regression_loss: 0.8418 - classification_loss: 0.1195 314/500 [=================>............] - ETA: 1:12 - loss: 0.9625 - regression_loss: 0.8429 - classification_loss: 0.1196 315/500 [=================>............] - ETA: 1:12 - loss: 0.9617 - regression_loss: 0.8423 - classification_loss: 0.1195 316/500 [=================>............] - ETA: 1:11 - loss: 0.9617 - regression_loss: 0.8423 - classification_loss: 0.1194 317/500 [==================>...........] - ETA: 1:11 - loss: 0.9626 - regression_loss: 0.8430 - classification_loss: 0.1196 318/500 [==================>...........] - ETA: 1:10 - loss: 0.9644 - regression_loss: 0.8444 - classification_loss: 0.1200 319/500 [==================>...........] - ETA: 1:10 - loss: 0.9638 - regression_loss: 0.8440 - classification_loss: 0.1199 320/500 [==================>...........] - ETA: 1:10 - loss: 0.9661 - regression_loss: 0.8459 - classification_loss: 0.1201 321/500 [==================>...........] - ETA: 1:09 - loss: 0.9686 - regression_loss: 0.8482 - classification_loss: 0.1204 322/500 [==================>...........] - ETA: 1:09 - loss: 0.9677 - regression_loss: 0.8474 - classification_loss: 0.1203 323/500 [==================>...........] - ETA: 1:08 - loss: 0.9688 - regression_loss: 0.8482 - classification_loss: 0.1206 324/500 [==================>...........] - ETA: 1:08 - loss: 0.9677 - regression_loss: 0.8472 - classification_loss: 0.1205 325/500 [==================>...........] - ETA: 1:08 - loss: 0.9671 - regression_loss: 0.8467 - classification_loss: 0.1204 326/500 [==================>...........] - ETA: 1:07 - loss: 0.9666 - regression_loss: 0.8464 - classification_loss: 0.1202 327/500 [==================>...........] - ETA: 1:07 - loss: 0.9688 - regression_loss: 0.8484 - classification_loss: 0.1204 328/500 [==================>...........] 
- ETA: 1:07 - loss: 0.9686 - regression_loss: 0.8483 - classification_loss: 0.1203 329/500 [==================>...........] - ETA: 1:06 - loss: 0.9683 - regression_loss: 0.8480 - classification_loss: 0.1202 330/500 [==================>...........] - ETA: 1:06 - loss: 0.9682 - regression_loss: 0.8479 - classification_loss: 0.1203 331/500 [==================>...........] - ETA: 1:05 - loss: 0.9665 - regression_loss: 0.8464 - classification_loss: 0.1201 332/500 [==================>...........] - ETA: 1:05 - loss: 0.9677 - regression_loss: 0.8475 - classification_loss: 0.1202 333/500 [==================>...........] - ETA: 1:05 - loss: 0.9681 - regression_loss: 0.8478 - classification_loss: 0.1203 334/500 [===================>..........] - ETA: 1:04 - loss: 0.9667 - regression_loss: 0.8464 - classification_loss: 0.1203 335/500 [===================>..........] - ETA: 1:04 - loss: 0.9660 - regression_loss: 0.8456 - classification_loss: 0.1205 336/500 [===================>..........] - ETA: 1:03 - loss: 0.9650 - regression_loss: 0.8446 - classification_loss: 0.1204 337/500 [===================>..........] - ETA: 1:03 - loss: 0.9640 - regression_loss: 0.8437 - classification_loss: 0.1203 338/500 [===================>..........] - ETA: 1:03 - loss: 0.9666 - regression_loss: 0.8457 - classification_loss: 0.1209 339/500 [===================>..........] - ETA: 1:02 - loss: 0.9678 - regression_loss: 0.8469 - classification_loss: 0.1209 340/500 [===================>..........] - ETA: 1:02 - loss: 0.9682 - regression_loss: 0.8472 - classification_loss: 0.1210 341/500 [===================>..........] - ETA: 1:01 - loss: 0.9691 - regression_loss: 0.8480 - classification_loss: 0.1211 342/500 [===================>..........] - ETA: 1:01 - loss: 0.9698 - regression_loss: 0.8486 - classification_loss: 0.1212 343/500 [===================>..........] - ETA: 1:01 - loss: 0.9690 - regression_loss: 0.8479 - classification_loss: 0.1211 344/500 [===================>..........] 
- ETA: 1:00 - loss: 0.9689 - regression_loss: 0.8478 - classification_loss: 0.1211 345/500 [===================>..........] - ETA: 1:00 - loss: 0.9676 - regression_loss: 0.8467 - classification_loss: 0.1209 346/500 [===================>..........] - ETA: 1:00 - loss: 0.9669 - regression_loss: 0.8461 - classification_loss: 0.1208 347/500 [===================>..........] - ETA: 59s - loss: 0.9665 - regression_loss: 0.8458 - classification_loss: 0.1207  348/500 [===================>..........] - ETA: 59s - loss: 0.9654 - regression_loss: 0.8449 - classification_loss: 0.1206 349/500 [===================>..........] - ETA: 58s - loss: 0.9655 - regression_loss: 0.8450 - classification_loss: 0.1205 350/500 [====================>.........] - ETA: 58s - loss: 0.9647 - regression_loss: 0.8443 - classification_loss: 0.1204 351/500 [====================>.........] - ETA: 58s - loss: 0.9646 - regression_loss: 0.8444 - classification_loss: 0.1203 352/500 [====================>.........] - ETA: 57s - loss: 0.9627 - regression_loss: 0.8427 - classification_loss: 0.1200 353/500 [====================>.........] - ETA: 57s - loss: 0.9621 - regression_loss: 0.8420 - classification_loss: 0.1202 354/500 [====================>.........] - ETA: 56s - loss: 0.9620 - regression_loss: 0.8415 - classification_loss: 0.1205 355/500 [====================>.........] - ETA: 56s - loss: 0.9623 - regression_loss: 0.8418 - classification_loss: 0.1205 356/500 [====================>.........] - ETA: 56s - loss: 0.9623 - regression_loss: 0.8419 - classification_loss: 0.1204 357/500 [====================>.........] - ETA: 55s - loss: 0.9638 - regression_loss: 0.8431 - classification_loss: 0.1208 358/500 [====================>.........] - ETA: 55s - loss: 0.9652 - regression_loss: 0.8441 - classification_loss: 0.1211 359/500 [====================>.........] - ETA: 54s - loss: 0.9651 - regression_loss: 0.8440 - classification_loss: 0.1211 360/500 [====================>.........] 
- ETA: 54s - loss: 0.9652 - regression_loss: 0.8441 - classification_loss: 0.1211 361/500 [====================>.........] - ETA: 54s - loss: 0.9647 - regression_loss: 0.8436 - classification_loss: 0.1211 362/500 [====================>.........] - ETA: 53s - loss: 0.9635 - regression_loss: 0.8426 - classification_loss: 0.1209 363/500 [====================>.........] - ETA: 53s - loss: 0.9629 - regression_loss: 0.8422 - classification_loss: 0.1208 364/500 [====================>.........] - ETA: 53s - loss: 0.9633 - regression_loss: 0.8425 - classification_loss: 0.1208 365/500 [====================>.........] - ETA: 52s - loss: 0.9623 - regression_loss: 0.8416 - classification_loss: 0.1207 366/500 [====================>.........] - ETA: 52s - loss: 0.9617 - regression_loss: 0.8409 - classification_loss: 0.1208 367/500 [=====================>........] - ETA: 51s - loss: 0.9602 - regression_loss: 0.8396 - classification_loss: 0.1206 368/500 [=====================>........] - ETA: 51s - loss: 0.9600 - regression_loss: 0.8396 - classification_loss: 0.1204 369/500 [=====================>........] - ETA: 51s - loss: 0.9600 - regression_loss: 0.8397 - classification_loss: 0.1203 370/500 [=====================>........] - ETA: 50s - loss: 0.9596 - regression_loss: 0.8393 - classification_loss: 0.1202 371/500 [=====================>........] - ETA: 50s - loss: 0.9596 - regression_loss: 0.8395 - classification_loss: 0.1202 372/500 [=====================>........] - ETA: 49s - loss: 0.9592 - regression_loss: 0.8392 - classification_loss: 0.1200 373/500 [=====================>........] - ETA: 49s - loss: 0.9578 - regression_loss: 0.8380 - classification_loss: 0.1198 374/500 [=====================>........] - ETA: 49s - loss: 0.9573 - regression_loss: 0.8376 - classification_loss: 0.1198 375/500 [=====================>........] - ETA: 48s - loss: 0.9577 - regression_loss: 0.8378 - classification_loss: 0.1199 376/500 [=====================>........] 
[epoch 8 per-batch progress output (steps 377-499) omitted; running loss held near 0.95-0.96 throughout]
500/500 [==============================] - 195s 390ms/step - loss: 0.9565 - regression_loss: 0.8381 - classification_loss: 0.1184
326 instances of class plum with average precision: 0.8342
mAP: 0.8342
Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5
Epoch 9/20
[epoch 9 per-batch progress output (steps 1-211) omitted; running loss around 0.89 at step 210, log truncated mid-epoch]
- ETA: 1:52 - loss: 0.8885 - regression_loss: 0.7828 - classification_loss: 0.1057 212/500 [===========>..................] - ETA: 1:51 - loss: 0.8902 - regression_loss: 0.7844 - classification_loss: 0.1058 213/500 [===========>..................] - ETA: 1:51 - loss: 0.8899 - regression_loss: 0.7843 - classification_loss: 0.1056 214/500 [===========>..................] - ETA: 1:51 - loss: 0.8892 - regression_loss: 0.7838 - classification_loss: 0.1054 215/500 [===========>..................] - ETA: 1:50 - loss: 0.8908 - regression_loss: 0.7847 - classification_loss: 0.1061 216/500 [===========>..................] - ETA: 1:50 - loss: 0.8928 - regression_loss: 0.7862 - classification_loss: 0.1065 217/500 [============>.................] - ETA: 1:50 - loss: 0.8933 - regression_loss: 0.7865 - classification_loss: 0.1068 218/500 [============>.................] - ETA: 1:49 - loss: 0.8922 - regression_loss: 0.7857 - classification_loss: 0.1065 219/500 [============>.................] - ETA: 1:49 - loss: 0.8914 - regression_loss: 0.7850 - classification_loss: 0.1064 220/500 [============>.................] - ETA: 1:48 - loss: 0.8916 - regression_loss: 0.7852 - classification_loss: 0.1064 221/500 [============>.................] - ETA: 1:48 - loss: 0.8962 - regression_loss: 0.7888 - classification_loss: 0.1074 222/500 [============>.................] - ETA: 1:48 - loss: 0.8946 - regression_loss: 0.7874 - classification_loss: 0.1071 223/500 [============>.................] - ETA: 1:47 - loss: 0.8934 - regression_loss: 0.7865 - classification_loss: 0.1069 224/500 [============>.................] - ETA: 1:47 - loss: 0.8932 - regression_loss: 0.7863 - classification_loss: 0.1068 225/500 [============>.................] - ETA: 1:46 - loss: 0.8949 - regression_loss: 0.7877 - classification_loss: 0.1072 226/500 [============>.................] - ETA: 1:46 - loss: 0.8948 - regression_loss: 0.7878 - classification_loss: 0.1071 227/500 [============>.................] 
- ETA: 1:46 - loss: 0.8941 - regression_loss: 0.7872 - classification_loss: 0.1068 228/500 [============>.................] - ETA: 1:45 - loss: 0.8928 - regression_loss: 0.7862 - classification_loss: 0.1066 229/500 [============>.................] - ETA: 1:45 - loss: 0.8922 - regression_loss: 0.7857 - classification_loss: 0.1065 230/500 [============>.................] - ETA: 1:45 - loss: 0.8902 - regression_loss: 0.7839 - classification_loss: 0.1063 231/500 [============>.................] - ETA: 1:44 - loss: 0.8914 - regression_loss: 0.7848 - classification_loss: 0.1066 232/500 [============>.................] - ETA: 1:44 - loss: 0.8918 - regression_loss: 0.7852 - classification_loss: 0.1066 233/500 [============>.................] - ETA: 1:43 - loss: 0.8922 - regression_loss: 0.7856 - classification_loss: 0.1065 234/500 [=============>................] - ETA: 1:43 - loss: 0.8913 - regression_loss: 0.7849 - classification_loss: 0.1064 235/500 [=============>................] - ETA: 1:43 - loss: 0.8905 - regression_loss: 0.7842 - classification_loss: 0.1063 236/500 [=============>................] - ETA: 1:42 - loss: 0.8894 - regression_loss: 0.7833 - classification_loss: 0.1061 237/500 [=============>................] - ETA: 1:42 - loss: 0.8875 - regression_loss: 0.7816 - classification_loss: 0.1059 238/500 [=============>................] - ETA: 1:41 - loss: 0.8859 - regression_loss: 0.7801 - classification_loss: 0.1058 239/500 [=============>................] - ETA: 1:41 - loss: 0.8892 - regression_loss: 0.7824 - classification_loss: 0.1067 240/500 [=============>................] - ETA: 1:41 - loss: 0.8902 - regression_loss: 0.7834 - classification_loss: 0.1069 241/500 [=============>................] - ETA: 1:40 - loss: 0.8913 - regression_loss: 0.7845 - classification_loss: 0.1069 242/500 [=============>................] - ETA: 1:40 - loss: 0.8913 - regression_loss: 0.7844 - classification_loss: 0.1069 243/500 [=============>................] 
- ETA: 1:40 - loss: 0.8924 - regression_loss: 0.7854 - classification_loss: 0.1069 244/500 [=============>................] - ETA: 1:39 - loss: 0.8934 - regression_loss: 0.7865 - classification_loss: 0.1069 245/500 [=============>................] - ETA: 1:39 - loss: 0.8939 - regression_loss: 0.7868 - classification_loss: 0.1071 246/500 [=============>................] - ETA: 1:38 - loss: 0.8961 - regression_loss: 0.7884 - classification_loss: 0.1077 247/500 [=============>................] - ETA: 1:38 - loss: 0.8955 - regression_loss: 0.7879 - classification_loss: 0.1076 248/500 [=============>................] - ETA: 1:38 - loss: 0.8952 - regression_loss: 0.7875 - classification_loss: 0.1077 249/500 [=============>................] - ETA: 1:37 - loss: 0.8959 - regression_loss: 0.7882 - classification_loss: 0.1077 250/500 [==============>...............] - ETA: 1:37 - loss: 0.8965 - regression_loss: 0.7887 - classification_loss: 0.1078 251/500 [==============>...............] - ETA: 1:36 - loss: 0.8966 - regression_loss: 0.7888 - classification_loss: 0.1078 252/500 [==============>...............] - ETA: 1:36 - loss: 0.8994 - regression_loss: 0.7914 - classification_loss: 0.1080 253/500 [==============>...............] - ETA: 1:36 - loss: 0.9003 - regression_loss: 0.7923 - classification_loss: 0.1080 254/500 [==============>...............] - ETA: 1:35 - loss: 0.9011 - regression_loss: 0.7931 - classification_loss: 0.1079 255/500 [==============>...............] - ETA: 1:35 - loss: 0.9016 - regression_loss: 0.7937 - classification_loss: 0.1079 256/500 [==============>...............] - ETA: 1:35 - loss: 0.9024 - regression_loss: 0.7942 - classification_loss: 0.1082 257/500 [==============>...............] - ETA: 1:34 - loss: 0.9026 - regression_loss: 0.7943 - classification_loss: 0.1084 258/500 [==============>...............] - ETA: 1:34 - loss: 0.9028 - regression_loss: 0.7945 - classification_loss: 0.1083 259/500 [==============>...............] 
- ETA: 1:33 - loss: 0.9018 - regression_loss: 0.7937 - classification_loss: 0.1081 260/500 [==============>...............] - ETA: 1:33 - loss: 0.9023 - regression_loss: 0.7940 - classification_loss: 0.1083 261/500 [==============>...............] - ETA: 1:33 - loss: 0.9047 - regression_loss: 0.7963 - classification_loss: 0.1084 262/500 [==============>...............] - ETA: 1:32 - loss: 0.9069 - regression_loss: 0.7978 - classification_loss: 0.1091 263/500 [==============>...............] - ETA: 1:32 - loss: 0.9064 - regression_loss: 0.7975 - classification_loss: 0.1089 264/500 [==============>...............] - ETA: 1:31 - loss: 0.9057 - regression_loss: 0.7969 - classification_loss: 0.1088 265/500 [==============>...............] - ETA: 1:31 - loss: 0.9053 - regression_loss: 0.7967 - classification_loss: 0.1086 266/500 [==============>...............] - ETA: 1:31 - loss: 0.9061 - regression_loss: 0.7974 - classification_loss: 0.1087 267/500 [===============>..............] - ETA: 1:30 - loss: 0.9090 - regression_loss: 0.7999 - classification_loss: 0.1091 268/500 [===============>..............] - ETA: 1:30 - loss: 0.9087 - regression_loss: 0.7998 - classification_loss: 0.1089 269/500 [===============>..............] - ETA: 1:29 - loss: 0.9089 - regression_loss: 0.7999 - classification_loss: 0.1090 270/500 [===============>..............] - ETA: 1:29 - loss: 0.9086 - regression_loss: 0.7998 - classification_loss: 0.1088 271/500 [===============>..............] - ETA: 1:29 - loss: 0.9079 - regression_loss: 0.7991 - classification_loss: 0.1087 272/500 [===============>..............] - ETA: 1:28 - loss: 0.9076 - regression_loss: 0.7990 - classification_loss: 0.1087 273/500 [===============>..............] - ETA: 1:28 - loss: 0.9079 - regression_loss: 0.7993 - classification_loss: 0.1086 274/500 [===============>..............] - ETA: 1:28 - loss: 0.9082 - regression_loss: 0.7996 - classification_loss: 0.1086 275/500 [===============>..............] 
- ETA: 1:27 - loss: 0.9077 - regression_loss: 0.7992 - classification_loss: 0.1085 276/500 [===============>..............] - ETA: 1:27 - loss: 0.9070 - regression_loss: 0.7987 - classification_loss: 0.1083 277/500 [===============>..............] - ETA: 1:26 - loss: 0.9074 - regression_loss: 0.7991 - classification_loss: 0.1083 278/500 [===============>..............] - ETA: 1:26 - loss: 0.9087 - regression_loss: 0.8002 - classification_loss: 0.1085 279/500 [===============>..............] - ETA: 1:26 - loss: 0.9121 - regression_loss: 0.8033 - classification_loss: 0.1089 280/500 [===============>..............] - ETA: 1:25 - loss: 0.9116 - regression_loss: 0.8029 - classification_loss: 0.1087 281/500 [===============>..............] - ETA: 1:25 - loss: 0.9113 - regression_loss: 0.8026 - classification_loss: 0.1087 282/500 [===============>..............] - ETA: 1:24 - loss: 0.9112 - regression_loss: 0.8026 - classification_loss: 0.1086 283/500 [===============>..............] - ETA: 1:24 - loss: 0.9106 - regression_loss: 0.8021 - classification_loss: 0.1085 284/500 [================>.............] - ETA: 1:24 - loss: 0.9106 - regression_loss: 0.8022 - classification_loss: 0.1084 285/500 [================>.............] - ETA: 1:23 - loss: 0.9102 - regression_loss: 0.8017 - classification_loss: 0.1085 286/500 [================>.............] - ETA: 1:23 - loss: 0.9094 - regression_loss: 0.8010 - classification_loss: 0.1084 287/500 [================>.............] - ETA: 1:23 - loss: 0.9089 - regression_loss: 0.8005 - classification_loss: 0.1084 288/500 [================>.............] - ETA: 1:22 - loss: 0.9086 - regression_loss: 0.8003 - classification_loss: 0.1082 289/500 [================>.............] - ETA: 1:22 - loss: 0.9088 - regression_loss: 0.8005 - classification_loss: 0.1083 290/500 [================>.............] - ETA: 1:21 - loss: 0.9074 - regression_loss: 0.7993 - classification_loss: 0.1081 291/500 [================>.............] 
- ETA: 1:21 - loss: 0.9077 - regression_loss: 0.7997 - classification_loss: 0.1080 292/500 [================>.............] - ETA: 1:21 - loss: 0.9081 - regression_loss: 0.8000 - classification_loss: 0.1081 293/500 [================>.............] - ETA: 1:20 - loss: 0.9086 - regression_loss: 0.8004 - classification_loss: 0.1082 294/500 [================>.............] - ETA: 1:20 - loss: 0.9078 - regression_loss: 0.7997 - classification_loss: 0.1081 295/500 [================>.............] - ETA: 1:19 - loss: 0.9073 - regression_loss: 0.7992 - classification_loss: 0.1080 296/500 [================>.............] - ETA: 1:19 - loss: 0.9065 - regression_loss: 0.7984 - classification_loss: 0.1081 297/500 [================>.............] - ETA: 1:19 - loss: 0.9060 - regression_loss: 0.7980 - classification_loss: 0.1080 298/500 [================>.............] - ETA: 1:18 - loss: 0.9054 - regression_loss: 0.7975 - classification_loss: 0.1079 299/500 [================>.............] - ETA: 1:18 - loss: 0.9050 - regression_loss: 0.7973 - classification_loss: 0.1077 300/500 [=================>............] - ETA: 1:17 - loss: 0.9038 - regression_loss: 0.7963 - classification_loss: 0.1075 301/500 [=================>............] - ETA: 1:17 - loss: 0.9049 - regression_loss: 0.7974 - classification_loss: 0.1075 302/500 [=================>............] - ETA: 1:17 - loss: 0.9041 - regression_loss: 0.7969 - classification_loss: 0.1073 303/500 [=================>............] - ETA: 1:16 - loss: 0.9022 - regression_loss: 0.7952 - classification_loss: 0.1070 304/500 [=================>............] - ETA: 1:16 - loss: 0.9017 - regression_loss: 0.7947 - classification_loss: 0.1069 305/500 [=================>............] - ETA: 1:16 - loss: 0.9020 - regression_loss: 0.7951 - classification_loss: 0.1069 306/500 [=================>............] - ETA: 1:15 - loss: 0.9031 - regression_loss: 0.7962 - classification_loss: 0.1069 307/500 [=================>............] 
- ETA: 1:15 - loss: 0.9033 - regression_loss: 0.7965 - classification_loss: 0.1068 308/500 [=================>............] - ETA: 1:14 - loss: 0.9033 - regression_loss: 0.7965 - classification_loss: 0.1068 309/500 [=================>............] - ETA: 1:14 - loss: 0.9031 - regression_loss: 0.7964 - classification_loss: 0.1067 310/500 [=================>............] - ETA: 1:14 - loss: 0.9020 - regression_loss: 0.7953 - classification_loss: 0.1066 311/500 [=================>............] - ETA: 1:13 - loss: 0.9015 - regression_loss: 0.7950 - classification_loss: 0.1065 312/500 [=================>............] - ETA: 1:13 - loss: 0.8999 - regression_loss: 0.7936 - classification_loss: 0.1063 313/500 [=================>............] - ETA: 1:12 - loss: 0.8985 - regression_loss: 0.7924 - classification_loss: 0.1061 314/500 [=================>............] - ETA: 1:12 - loss: 0.8977 - regression_loss: 0.7915 - classification_loss: 0.1062 315/500 [=================>............] - ETA: 1:12 - loss: 0.8963 - regression_loss: 0.7902 - classification_loss: 0.1061 316/500 [=================>............] - ETA: 1:11 - loss: 0.8994 - regression_loss: 0.7930 - classification_loss: 0.1064 317/500 [==================>...........] - ETA: 1:11 - loss: 0.8989 - regression_loss: 0.7926 - classification_loss: 0.1063 318/500 [==================>...........] - ETA: 1:10 - loss: 0.9008 - regression_loss: 0.7942 - classification_loss: 0.1065 319/500 [==================>...........] - ETA: 1:10 - loss: 0.9004 - regression_loss: 0.7940 - classification_loss: 0.1064 320/500 [==================>...........] - ETA: 1:10 - loss: 0.9004 - regression_loss: 0.7941 - classification_loss: 0.1064 321/500 [==================>...........] - ETA: 1:09 - loss: 0.9001 - regression_loss: 0.7938 - classification_loss: 0.1062 322/500 [==================>...........] - ETA: 1:09 - loss: 0.8990 - regression_loss: 0.7929 - classification_loss: 0.1061 323/500 [==================>...........] 
- ETA: 1:08 - loss: 0.8998 - regression_loss: 0.7936 - classification_loss: 0.1062 324/500 [==================>...........] - ETA: 1:08 - loss: 0.8989 - regression_loss: 0.7928 - classification_loss: 0.1061 325/500 [==================>...........] - ETA: 1:08 - loss: 0.8993 - regression_loss: 0.7930 - classification_loss: 0.1063 326/500 [==================>...........] - ETA: 1:07 - loss: 0.8996 - regression_loss: 0.7934 - classification_loss: 0.1062 327/500 [==================>...........] - ETA: 1:07 - loss: 0.9008 - regression_loss: 0.7944 - classification_loss: 0.1064 328/500 [==================>...........] - ETA: 1:07 - loss: 0.9015 - regression_loss: 0.7950 - classification_loss: 0.1064 329/500 [==================>...........] - ETA: 1:06 - loss: 0.9000 - regression_loss: 0.7938 - classification_loss: 0.1062 330/500 [==================>...........] - ETA: 1:06 - loss: 0.9014 - regression_loss: 0.7950 - classification_loss: 0.1064 331/500 [==================>...........] - ETA: 1:05 - loss: 0.9030 - regression_loss: 0.7960 - classification_loss: 0.1069 332/500 [==================>...........] - ETA: 1:05 - loss: 0.9026 - regression_loss: 0.7958 - classification_loss: 0.1068 333/500 [==================>...........] - ETA: 1:05 - loss: 0.9023 - regression_loss: 0.7956 - classification_loss: 0.1068 334/500 [===================>..........] - ETA: 1:04 - loss: 0.9014 - regression_loss: 0.7948 - classification_loss: 0.1066 335/500 [===================>..........] - ETA: 1:04 - loss: 0.9008 - regression_loss: 0.7943 - classification_loss: 0.1065 336/500 [===================>..........] - ETA: 1:03 - loss: 0.9016 - regression_loss: 0.7950 - classification_loss: 0.1066 337/500 [===================>..........] - ETA: 1:03 - loss: 0.9013 - regression_loss: 0.7949 - classification_loss: 0.1064 338/500 [===================>..........] - ETA: 1:03 - loss: 0.9012 - regression_loss: 0.7948 - classification_loss: 0.1064 339/500 [===================>..........] 
- ETA: 1:02 - loss: 0.9005 - regression_loss: 0.7936 - classification_loss: 0.1069 340/500 [===================>..........] - ETA: 1:02 - loss: 0.8998 - regression_loss: 0.7929 - classification_loss: 0.1069 341/500 [===================>..........] - ETA: 1:01 - loss: 0.9011 - regression_loss: 0.7941 - classification_loss: 0.1070 342/500 [===================>..........] - ETA: 1:01 - loss: 0.9019 - regression_loss: 0.7947 - classification_loss: 0.1072 343/500 [===================>..........] - ETA: 1:01 - loss: 0.9008 - regression_loss: 0.7937 - classification_loss: 0.1071 344/500 [===================>..........] - ETA: 1:00 - loss: 0.9009 - regression_loss: 0.7938 - classification_loss: 0.1070 345/500 [===================>..........] - ETA: 1:00 - loss: 0.9014 - regression_loss: 0.7941 - classification_loss: 0.1073 346/500 [===================>..........] - ETA: 1:00 - loss: 0.9002 - regression_loss: 0.7930 - classification_loss: 0.1072 347/500 [===================>..........] - ETA: 59s - loss: 0.9008 - regression_loss: 0.7935 - classification_loss: 0.1073  348/500 [===================>..........] - ETA: 59s - loss: 0.9021 - regression_loss: 0.7944 - classification_loss: 0.1077 349/500 [===================>..........] - ETA: 58s - loss: 0.9017 - regression_loss: 0.7941 - classification_loss: 0.1077 350/500 [====================>.........] - ETA: 58s - loss: 0.9009 - regression_loss: 0.7934 - classification_loss: 0.1075 351/500 [====================>.........] - ETA: 58s - loss: 0.9014 - regression_loss: 0.7940 - classification_loss: 0.1074 352/500 [====================>.........] - ETA: 57s - loss: 0.8998 - regression_loss: 0.7925 - classification_loss: 0.1073 353/500 [====================>.........] - ETA: 57s - loss: 0.8989 - regression_loss: 0.7918 - classification_loss: 0.1071 354/500 [====================>.........] - ETA: 56s - loss: 0.8983 - regression_loss: 0.7913 - classification_loss: 0.1070 355/500 [====================>.........] 
- ETA: 56s - loss: 0.8976 - regression_loss: 0.7907 - classification_loss: 0.1069 356/500 [====================>.........] - ETA: 56s - loss: 0.8967 - regression_loss: 0.7899 - classification_loss: 0.1068 357/500 [====================>.........] - ETA: 55s - loss: 0.8974 - regression_loss: 0.7905 - classification_loss: 0.1069 358/500 [====================>.........] - ETA: 55s - loss: 0.8968 - regression_loss: 0.7899 - classification_loss: 0.1068 359/500 [====================>.........] - ETA: 54s - loss: 0.8962 - regression_loss: 0.7894 - classification_loss: 0.1068 360/500 [====================>.........] - ETA: 54s - loss: 0.8964 - regression_loss: 0.7895 - classification_loss: 0.1068 361/500 [====================>.........] - ETA: 54s - loss: 0.8971 - regression_loss: 0.7902 - classification_loss: 0.1069 362/500 [====================>.........] - ETA: 53s - loss: 0.8965 - regression_loss: 0.7895 - classification_loss: 0.1070 363/500 [====================>.........] - ETA: 53s - loss: 0.8951 - regression_loss: 0.7882 - classification_loss: 0.1068 364/500 [====================>.........] - ETA: 53s - loss: 0.8947 - regression_loss: 0.7880 - classification_loss: 0.1067 365/500 [====================>.........] - ETA: 52s - loss: 0.8932 - regression_loss: 0.7867 - classification_loss: 0.1065 366/500 [====================>.........] - ETA: 52s - loss: 0.8925 - regression_loss: 0.7861 - classification_loss: 0.1064 367/500 [=====================>........] - ETA: 51s - loss: 0.8916 - regression_loss: 0.7852 - classification_loss: 0.1064 368/500 [=====================>........] - ETA: 51s - loss: 0.8940 - regression_loss: 0.7869 - classification_loss: 0.1071 369/500 [=====================>........] - ETA: 51s - loss: 0.8936 - regression_loss: 0.7866 - classification_loss: 0.1070 370/500 [=====================>........] - ETA: 50s - loss: 0.8935 - regression_loss: 0.7865 - classification_loss: 0.1070 371/500 [=====================>........] 
- ETA: 50s - loss: 0.8922 - regression_loss: 0.7854 - classification_loss: 0.1068 372/500 [=====================>........] - ETA: 49s - loss: 0.8916 - regression_loss: 0.7849 - classification_loss: 0.1067 373/500 [=====================>........] - ETA: 49s - loss: 0.8921 - regression_loss: 0.7852 - classification_loss: 0.1069 374/500 [=====================>........] - ETA: 49s - loss: 0.8923 - regression_loss: 0.7854 - classification_loss: 0.1069 375/500 [=====================>........] - ETA: 48s - loss: 0.8932 - regression_loss: 0.7861 - classification_loss: 0.1070 376/500 [=====================>........] - ETA: 48s - loss: 0.8932 - regression_loss: 0.7863 - classification_loss: 0.1069 377/500 [=====================>........] - ETA: 47s - loss: 0.8924 - regression_loss: 0.7856 - classification_loss: 0.1068 378/500 [=====================>........] - ETA: 47s - loss: 0.8917 - regression_loss: 0.7847 - classification_loss: 0.1070 379/500 [=====================>........] - ETA: 47s - loss: 0.8915 - regression_loss: 0.7845 - classification_loss: 0.1070 380/500 [=====================>........] - ETA: 46s - loss: 0.8917 - regression_loss: 0.7847 - classification_loss: 0.1070 381/500 [=====================>........] - ETA: 46s - loss: 0.8929 - regression_loss: 0.7857 - classification_loss: 0.1072 382/500 [=====================>........] - ETA: 45s - loss: 0.8936 - regression_loss: 0.7866 - classification_loss: 0.1070 383/500 [=====================>........] - ETA: 45s - loss: 0.8950 - regression_loss: 0.7878 - classification_loss: 0.1072 384/500 [======================>.......] - ETA: 45s - loss: 0.8935 - regression_loss: 0.7866 - classification_loss: 0.1069 385/500 [======================>.......] - ETA: 44s - loss: 0.8941 - regression_loss: 0.7873 - classification_loss: 0.1068 386/500 [======================>.......] - ETA: 44s - loss: 0.8933 - regression_loss: 0.7866 - classification_loss: 0.1067 387/500 [======================>.......] 
- ETA: 44s - loss: 0.8925 - regression_loss: 0.7859 - classification_loss: 0.1066 388/500 [======================>.......] - ETA: 43s - loss: 0.8921 - regression_loss: 0.7856 - classification_loss: 0.1065 389/500 [======================>.......] - ETA: 43s - loss: 0.8913 - regression_loss: 0.7849 - classification_loss: 0.1064 390/500 [======================>.......] - ETA: 42s - loss: 0.8906 - regression_loss: 0.7844 - classification_loss: 0.1063 391/500 [======================>.......] - ETA: 42s - loss: 0.8898 - regression_loss: 0.7837 - classification_loss: 0.1061 392/500 [======================>.......] - ETA: 42s - loss: 0.8914 - regression_loss: 0.7851 - classification_loss: 0.1062 393/500 [======================>.......] - ETA: 41s - loss: 0.8903 - regression_loss: 0.7842 - classification_loss: 0.1061 394/500 [======================>.......] - ETA: 41s - loss: 0.8901 - regression_loss: 0.7840 - classification_loss: 0.1061 395/500 [======================>.......] - ETA: 40s - loss: 0.8931 - regression_loss: 0.7863 - classification_loss: 0.1067 396/500 [======================>.......] - ETA: 40s - loss: 0.8931 - regression_loss: 0.7865 - classification_loss: 0.1066 397/500 [======================>.......] - ETA: 40s - loss: 0.8925 - regression_loss: 0.7860 - classification_loss: 0.1065 398/500 [======================>.......] - ETA: 39s - loss: 0.8920 - regression_loss: 0.7856 - classification_loss: 0.1064 399/500 [======================>.......] - ETA: 39s - loss: 0.8914 - regression_loss: 0.7852 - classification_loss: 0.1062 400/500 [=======================>......] - ETA: 38s - loss: 0.8914 - regression_loss: 0.7852 - classification_loss: 0.1062 401/500 [=======================>......] - ETA: 38s - loss: 0.8911 - regression_loss: 0.7851 - classification_loss: 0.1060 402/500 [=======================>......] - ETA: 38s - loss: 0.8904 - regression_loss: 0.7844 - classification_loss: 0.1059 403/500 [=======================>......] 
- ETA: 37s - loss: 0.8907 - regression_loss: 0.7848 - classification_loss: 0.1059 404/500 [=======================>......] - ETA: 37s - loss: 0.8911 - regression_loss: 0.7851 - classification_loss: 0.1059 405/500 [=======================>......] - ETA: 37s - loss: 0.8910 - regression_loss: 0.7850 - classification_loss: 0.1060 406/500 [=======================>......] - ETA: 36s - loss: 0.8903 - regression_loss: 0.7845 - classification_loss: 0.1059 407/500 [=======================>......] - ETA: 36s - loss: 0.8898 - regression_loss: 0.7840 - classification_loss: 0.1058 408/500 [=======================>......] - ETA: 35s - loss: 0.8895 - regression_loss: 0.7839 - classification_loss: 0.1056 409/500 [=======================>......] - ETA: 35s - loss: 0.8883 - regression_loss: 0.7828 - classification_loss: 0.1055 410/500 [=======================>......] - ETA: 35s - loss: 0.8881 - regression_loss: 0.7827 - classification_loss: 0.1054 411/500 [=======================>......] - ETA: 34s - loss: 0.8872 - regression_loss: 0.7819 - classification_loss: 0.1054 412/500 [=======================>......] - ETA: 34s - loss: 0.8861 - regression_loss: 0.7809 - classification_loss: 0.1052 413/500 [=======================>......] - ETA: 33s - loss: 0.8878 - regression_loss: 0.7823 - classification_loss: 0.1056 414/500 [=======================>......] - ETA: 33s - loss: 0.8874 - regression_loss: 0.7819 - classification_loss: 0.1055 415/500 [=======================>......] - ETA: 33s - loss: 0.8874 - regression_loss: 0.7820 - classification_loss: 0.1054 416/500 [=======================>......] - ETA: 32s - loss: 0.8882 - regression_loss: 0.7827 - classification_loss: 0.1055 417/500 [========================>.....] - ETA: 32s - loss: 0.8878 - regression_loss: 0.7825 - classification_loss: 0.1053 418/500 [========================>.....] - ETA: 31s - loss: 0.8878 - regression_loss: 0.7825 - classification_loss: 0.1053 419/500 [========================>.....] 
500/500 [==============================] - 195s 390ms/step - loss: 0.8755 - regression_loss: 0.7716 - classification_loss: 0.1039
326 instances of class plum with average precision: 0.8131
mAP: 0.8131
Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5
Epoch 10/20
253/500 [==============>...............] - ETA: 1:36 - loss: 0.8123 - regression_loss: 0.7197 - classification_loss: 0.0925
254/500 [==============>...............]
- ETA: 1:36 - loss: 0.8120 - regression_loss: 0.7195 - classification_loss: 0.0924 255/500 [==============>...............] - ETA: 1:35 - loss: 0.8104 - regression_loss: 0.7180 - classification_loss: 0.0923 256/500 [==============>...............] - ETA: 1:35 - loss: 0.8097 - regression_loss: 0.7175 - classification_loss: 0.0922 257/500 [==============>...............] - ETA: 1:34 - loss: 0.8088 - regression_loss: 0.7167 - classification_loss: 0.0920 258/500 [==============>...............] - ETA: 1:34 - loss: 0.8100 - regression_loss: 0.7177 - classification_loss: 0.0923 259/500 [==============>...............] - ETA: 1:34 - loss: 0.8093 - regression_loss: 0.7172 - classification_loss: 0.0921 260/500 [==============>...............] - ETA: 1:33 - loss: 0.8091 - regression_loss: 0.7169 - classification_loss: 0.0922 261/500 [==============>...............] - ETA: 1:33 - loss: 0.8076 - regression_loss: 0.7156 - classification_loss: 0.0921 262/500 [==============>...............] - ETA: 1:32 - loss: 0.8074 - regression_loss: 0.7154 - classification_loss: 0.0920 263/500 [==============>...............] - ETA: 1:32 - loss: 0.8073 - regression_loss: 0.7154 - classification_loss: 0.0919 264/500 [==============>...............] - ETA: 1:32 - loss: 0.8079 - regression_loss: 0.7159 - classification_loss: 0.0920 265/500 [==============>...............] - ETA: 1:31 - loss: 0.8060 - regression_loss: 0.7143 - classification_loss: 0.0917 266/500 [==============>...............] - ETA: 1:31 - loss: 0.8055 - regression_loss: 0.7139 - classification_loss: 0.0916 267/500 [===============>..............] - ETA: 1:30 - loss: 0.8079 - regression_loss: 0.7161 - classification_loss: 0.0918 268/500 [===============>..............] - ETA: 1:30 - loss: 0.8076 - regression_loss: 0.7158 - classification_loss: 0.0918 269/500 [===============>..............] - ETA: 1:30 - loss: 0.8056 - regression_loss: 0.7141 - classification_loss: 0.0915 270/500 [===============>..............] 
- ETA: 1:29 - loss: 0.8055 - regression_loss: 0.7140 - classification_loss: 0.0915 271/500 [===============>..............] - ETA: 1:29 - loss: 0.8056 - regression_loss: 0.7141 - classification_loss: 0.0915 272/500 [===============>..............] - ETA: 1:29 - loss: 0.8050 - regression_loss: 0.7136 - classification_loss: 0.0914 273/500 [===============>..............] - ETA: 1:28 - loss: 0.8063 - regression_loss: 0.7146 - classification_loss: 0.0917 274/500 [===============>..............] - ETA: 1:28 - loss: 0.8069 - regression_loss: 0.7151 - classification_loss: 0.0918 275/500 [===============>..............] - ETA: 1:27 - loss: 0.8074 - regression_loss: 0.7155 - classification_loss: 0.0919 276/500 [===============>..............] - ETA: 1:27 - loss: 0.8096 - regression_loss: 0.7175 - classification_loss: 0.0921 277/500 [===============>..............] - ETA: 1:27 - loss: 0.8081 - regression_loss: 0.7162 - classification_loss: 0.0919 278/500 [===============>..............] - ETA: 1:26 - loss: 0.8077 - regression_loss: 0.7159 - classification_loss: 0.0919 279/500 [===============>..............] - ETA: 1:26 - loss: 0.8081 - regression_loss: 0.7163 - classification_loss: 0.0918 280/500 [===============>..............] - ETA: 1:25 - loss: 0.8088 - regression_loss: 0.7170 - classification_loss: 0.0918 281/500 [===============>..............] - ETA: 1:25 - loss: 0.8074 - regression_loss: 0.7158 - classification_loss: 0.0916 282/500 [===============>..............] - ETA: 1:25 - loss: 0.8074 - regression_loss: 0.7158 - classification_loss: 0.0916 283/500 [===============>..............] - ETA: 1:24 - loss: 0.8067 - regression_loss: 0.7153 - classification_loss: 0.0914 284/500 [================>.............] - ETA: 1:24 - loss: 0.8064 - regression_loss: 0.7150 - classification_loss: 0.0914 285/500 [================>.............] - ETA: 1:23 - loss: 0.8049 - regression_loss: 0.7137 - classification_loss: 0.0912 286/500 [================>.............] 
- ETA: 1:23 - loss: 0.8055 - regression_loss: 0.7144 - classification_loss: 0.0911 287/500 [================>.............] - ETA: 1:23 - loss: 0.8053 - regression_loss: 0.7143 - classification_loss: 0.0910 288/500 [================>.............] - ETA: 1:22 - loss: 0.8051 - regression_loss: 0.7142 - classification_loss: 0.0910 289/500 [================>.............] - ETA: 1:22 - loss: 0.8083 - regression_loss: 0.7169 - classification_loss: 0.0914 290/500 [================>.............] - ETA: 1:21 - loss: 0.8077 - regression_loss: 0.7164 - classification_loss: 0.0913 291/500 [================>.............] - ETA: 1:21 - loss: 0.8066 - regression_loss: 0.7154 - classification_loss: 0.0912 292/500 [================>.............] - ETA: 1:21 - loss: 0.8095 - regression_loss: 0.7175 - classification_loss: 0.0919 293/500 [================>.............] - ETA: 1:20 - loss: 0.8094 - regression_loss: 0.7175 - classification_loss: 0.0919 294/500 [================>.............] - ETA: 1:20 - loss: 0.8112 - regression_loss: 0.7188 - classification_loss: 0.0924 295/500 [================>.............] - ETA: 1:20 - loss: 0.8115 - regression_loss: 0.7193 - classification_loss: 0.0923 296/500 [================>.............] - ETA: 1:19 - loss: 0.8126 - regression_loss: 0.7202 - classification_loss: 0.0923 297/500 [================>.............] - ETA: 1:19 - loss: 0.8130 - regression_loss: 0.7207 - classification_loss: 0.0923 298/500 [================>.............] - ETA: 1:18 - loss: 0.8120 - regression_loss: 0.7199 - classification_loss: 0.0922 299/500 [================>.............] - ETA: 1:18 - loss: 0.8119 - regression_loss: 0.7198 - classification_loss: 0.0921 300/500 [=================>............] - ETA: 1:18 - loss: 0.8117 - regression_loss: 0.7196 - classification_loss: 0.0920 301/500 [=================>............] - ETA: 1:17 - loss: 0.8108 - regression_loss: 0.7188 - classification_loss: 0.0919 302/500 [=================>............] 
- ETA: 1:17 - loss: 0.8122 - regression_loss: 0.7199 - classification_loss: 0.0923 303/500 [=================>............] - ETA: 1:16 - loss: 0.8121 - regression_loss: 0.7200 - classification_loss: 0.0921 304/500 [=================>............] - ETA: 1:16 - loss: 0.8138 - regression_loss: 0.7214 - classification_loss: 0.0923 305/500 [=================>............] - ETA: 1:16 - loss: 0.8142 - regression_loss: 0.7220 - classification_loss: 0.0922 306/500 [=================>............] - ETA: 1:15 - loss: 0.8157 - regression_loss: 0.7234 - classification_loss: 0.0923 307/500 [=================>............] - ETA: 1:15 - loss: 0.8151 - regression_loss: 0.7228 - classification_loss: 0.0922 308/500 [=================>............] - ETA: 1:14 - loss: 0.8160 - regression_loss: 0.7237 - classification_loss: 0.0923 309/500 [=================>............] - ETA: 1:14 - loss: 0.8146 - regression_loss: 0.7225 - classification_loss: 0.0921 310/500 [=================>............] - ETA: 1:14 - loss: 0.8153 - regression_loss: 0.7231 - classification_loss: 0.0921 311/500 [=================>............] - ETA: 1:13 - loss: 0.8152 - regression_loss: 0.7230 - classification_loss: 0.0921 312/500 [=================>............] - ETA: 1:13 - loss: 0.8143 - regression_loss: 0.7223 - classification_loss: 0.0920 313/500 [=================>............] - ETA: 1:12 - loss: 0.8129 - regression_loss: 0.7211 - classification_loss: 0.0919 314/500 [=================>............] - ETA: 1:12 - loss: 0.8119 - regression_loss: 0.7201 - classification_loss: 0.0918 315/500 [=================>............] - ETA: 1:12 - loss: 0.8114 - regression_loss: 0.7198 - classification_loss: 0.0916 316/500 [=================>............] - ETA: 1:11 - loss: 0.8108 - regression_loss: 0.7193 - classification_loss: 0.0915 317/500 [==================>...........] - ETA: 1:11 - loss: 0.8109 - regression_loss: 0.7194 - classification_loss: 0.0915 318/500 [==================>...........] 
- ETA: 1:10 - loss: 0.8116 - regression_loss: 0.7201 - classification_loss: 0.0915 319/500 [==================>...........] - ETA: 1:10 - loss: 0.8134 - regression_loss: 0.7215 - classification_loss: 0.0918 320/500 [==================>...........] - ETA: 1:10 - loss: 0.8130 - regression_loss: 0.7212 - classification_loss: 0.0918 321/500 [==================>...........] - ETA: 1:09 - loss: 0.8123 - regression_loss: 0.7204 - classification_loss: 0.0918 322/500 [==================>...........] - ETA: 1:09 - loss: 0.8114 - regression_loss: 0.7195 - classification_loss: 0.0920 323/500 [==================>...........] - ETA: 1:09 - loss: 0.8116 - regression_loss: 0.7197 - classification_loss: 0.0919 324/500 [==================>...........] - ETA: 1:08 - loss: 0.8140 - regression_loss: 0.7214 - classification_loss: 0.0926 325/500 [==================>...........] - ETA: 1:08 - loss: 0.8133 - regression_loss: 0.7208 - classification_loss: 0.0925 326/500 [==================>...........] - ETA: 1:07 - loss: 0.8136 - regression_loss: 0.7210 - classification_loss: 0.0926 327/500 [==================>...........] - ETA: 1:07 - loss: 0.8158 - regression_loss: 0.7225 - classification_loss: 0.0933 328/500 [==================>...........] - ETA: 1:07 - loss: 0.8155 - regression_loss: 0.7223 - classification_loss: 0.0933 329/500 [==================>...........] - ETA: 1:06 - loss: 0.8166 - regression_loss: 0.7233 - classification_loss: 0.0932 330/500 [==================>...........] - ETA: 1:06 - loss: 0.8153 - regression_loss: 0.7222 - classification_loss: 0.0931 331/500 [==================>...........] - ETA: 1:05 - loss: 0.8141 - regression_loss: 0.7212 - classification_loss: 0.0929 332/500 [==================>...........] - ETA: 1:05 - loss: 0.8136 - regression_loss: 0.7208 - classification_loss: 0.0928 333/500 [==================>...........] - ETA: 1:05 - loss: 0.8125 - regression_loss: 0.7197 - classification_loss: 0.0928 334/500 [===================>..........] 
- ETA: 1:04 - loss: 0.8112 - regression_loss: 0.7184 - classification_loss: 0.0927 335/500 [===================>..........] - ETA: 1:04 - loss: 0.8105 - regression_loss: 0.7179 - classification_loss: 0.0926 336/500 [===================>..........] - ETA: 1:03 - loss: 0.8093 - regression_loss: 0.7169 - classification_loss: 0.0925 337/500 [===================>..........] - ETA: 1:03 - loss: 0.8090 - regression_loss: 0.7167 - classification_loss: 0.0924 338/500 [===================>..........] - ETA: 1:03 - loss: 0.8089 - regression_loss: 0.7167 - classification_loss: 0.0922 339/500 [===================>..........] - ETA: 1:02 - loss: 0.8082 - regression_loss: 0.7161 - classification_loss: 0.0921 340/500 [===================>..........] - ETA: 1:02 - loss: 0.8080 - regression_loss: 0.7159 - classification_loss: 0.0921 341/500 [===================>..........] - ETA: 1:01 - loss: 0.8082 - regression_loss: 0.7162 - classification_loss: 0.0921 342/500 [===================>..........] - ETA: 1:01 - loss: 0.8076 - regression_loss: 0.7155 - classification_loss: 0.0920 343/500 [===================>..........] - ETA: 1:01 - loss: 0.8063 - regression_loss: 0.7145 - classification_loss: 0.0919 344/500 [===================>..........] - ETA: 1:00 - loss: 0.8063 - regression_loss: 0.7143 - classification_loss: 0.0919 345/500 [===================>..........] - ETA: 1:00 - loss: 0.8050 - regression_loss: 0.7133 - classification_loss: 0.0918 346/500 [===================>..........] - ETA: 59s - loss: 0.8037 - regression_loss: 0.7120 - classification_loss: 0.0916  347/500 [===================>..........] - ETA: 59s - loss: 0.8034 - regression_loss: 0.7119 - classification_loss: 0.0916 348/500 [===================>..........] - ETA: 59s - loss: 0.8025 - regression_loss: 0.7110 - classification_loss: 0.0915 349/500 [===================>..........] - ETA: 58s - loss: 0.8023 - regression_loss: 0.7108 - classification_loss: 0.0914 350/500 [====================>.........] 
- ETA: 58s - loss: 0.8018 - regression_loss: 0.7103 - classification_loss: 0.0915 351/500 [====================>.........] - ETA: 58s - loss: 0.8010 - regression_loss: 0.7097 - classification_loss: 0.0913 352/500 [====================>.........] - ETA: 57s - loss: 0.7997 - regression_loss: 0.7085 - classification_loss: 0.0912 353/500 [====================>.........] - ETA: 57s - loss: 0.7989 - regression_loss: 0.7079 - classification_loss: 0.0910 354/500 [====================>.........] - ETA: 56s - loss: 0.7996 - regression_loss: 0.7085 - classification_loss: 0.0911 355/500 [====================>.........] - ETA: 56s - loss: 0.7995 - regression_loss: 0.7085 - classification_loss: 0.0910 356/500 [====================>.........] - ETA: 56s - loss: 0.7991 - regression_loss: 0.7081 - classification_loss: 0.0910 357/500 [====================>.........] - ETA: 55s - loss: 0.7983 - regression_loss: 0.7074 - classification_loss: 0.0909 358/500 [====================>.........] - ETA: 55s - loss: 0.7980 - regression_loss: 0.7072 - classification_loss: 0.0909 359/500 [====================>.........] - ETA: 54s - loss: 0.7973 - regression_loss: 0.7066 - classification_loss: 0.0907 360/500 [====================>.........] - ETA: 54s - loss: 0.7980 - regression_loss: 0.7072 - classification_loss: 0.0908 361/500 [====================>.........] - ETA: 54s - loss: 0.7968 - regression_loss: 0.7062 - classification_loss: 0.0906 362/500 [====================>.........] - ETA: 53s - loss: 0.7965 - regression_loss: 0.7059 - classification_loss: 0.0906 363/500 [====================>.........] - ETA: 53s - loss: 0.7968 - regression_loss: 0.7059 - classification_loss: 0.0908 364/500 [====================>.........] - ETA: 52s - loss: 0.7972 - regression_loss: 0.7063 - classification_loss: 0.0909 365/500 [====================>.........] - ETA: 52s - loss: 0.7969 - regression_loss: 0.7060 - classification_loss: 0.0908 366/500 [====================>.........] 
- ETA: 52s - loss: 0.7968 - regression_loss: 0.7059 - classification_loss: 0.0908 367/500 [=====================>........] - ETA: 51s - loss: 0.7963 - regression_loss: 0.7056 - classification_loss: 0.0907 368/500 [=====================>........] - ETA: 51s - loss: 0.7959 - regression_loss: 0.7052 - classification_loss: 0.0908 369/500 [=====================>........] - ETA: 51s - loss: 0.7951 - regression_loss: 0.7044 - classification_loss: 0.0907 370/500 [=====================>........] - ETA: 50s - loss: 0.7959 - regression_loss: 0.7050 - classification_loss: 0.0909 371/500 [=====================>........] - ETA: 50s - loss: 0.7954 - regression_loss: 0.7046 - classification_loss: 0.0908 372/500 [=====================>........] - ETA: 49s - loss: 0.7954 - regression_loss: 0.7045 - classification_loss: 0.0909 373/500 [=====================>........] - ETA: 49s - loss: 0.7952 - regression_loss: 0.7044 - classification_loss: 0.0908 374/500 [=====================>........] - ETA: 49s - loss: 0.7954 - regression_loss: 0.7046 - classification_loss: 0.0908 375/500 [=====================>........] - ETA: 48s - loss: 0.7949 - regression_loss: 0.7042 - classification_loss: 0.0907 376/500 [=====================>........] - ETA: 48s - loss: 0.7945 - regression_loss: 0.7039 - classification_loss: 0.0906 377/500 [=====================>........] - ETA: 47s - loss: 0.7955 - regression_loss: 0.7047 - classification_loss: 0.0909 378/500 [=====================>........] - ETA: 47s - loss: 0.7946 - regression_loss: 0.7038 - classification_loss: 0.0907 379/500 [=====================>........] - ETA: 47s - loss: 0.7954 - regression_loss: 0.7046 - classification_loss: 0.0909 380/500 [=====================>........] - ETA: 46s - loss: 0.7952 - regression_loss: 0.7044 - classification_loss: 0.0908 381/500 [=====================>........] - ETA: 46s - loss: 0.7955 - regression_loss: 0.7047 - classification_loss: 0.0908 382/500 [=====================>........] 
- ETA: 45s - loss: 0.7954 - regression_loss: 0.7047 - classification_loss: 0.0907 383/500 [=====================>........] - ETA: 45s - loss: 0.7961 - regression_loss: 0.7052 - classification_loss: 0.0909 384/500 [======================>.......] - ETA: 45s - loss: 0.7959 - regression_loss: 0.7051 - classification_loss: 0.0908 385/500 [======================>.......] - ETA: 44s - loss: 0.7950 - regression_loss: 0.7043 - classification_loss: 0.0907 386/500 [======================>.......] - ETA: 44s - loss: 0.7943 - regression_loss: 0.7037 - classification_loss: 0.0906 387/500 [======================>.......] - ETA: 44s - loss: 0.7939 - regression_loss: 0.7034 - classification_loss: 0.0905 388/500 [======================>.......] - ETA: 43s - loss: 0.7936 - regression_loss: 0.7031 - classification_loss: 0.0904 389/500 [======================>.......] - ETA: 43s - loss: 0.7923 - regression_loss: 0.7020 - classification_loss: 0.0903 390/500 [======================>.......] - ETA: 42s - loss: 0.7917 - regression_loss: 0.7014 - classification_loss: 0.0903 391/500 [======================>.......] - ETA: 42s - loss: 0.7924 - regression_loss: 0.7021 - classification_loss: 0.0903 392/500 [======================>.......] - ETA: 42s - loss: 0.7921 - regression_loss: 0.7019 - classification_loss: 0.0903 393/500 [======================>.......] - ETA: 41s - loss: 0.7946 - regression_loss: 0.7037 - classification_loss: 0.0909 394/500 [======================>.......] - ETA: 41s - loss: 0.7940 - regression_loss: 0.7032 - classification_loss: 0.0908 395/500 [======================>.......] - ETA: 40s - loss: 0.7939 - regression_loss: 0.7032 - classification_loss: 0.0908 396/500 [======================>.......] - ETA: 40s - loss: 0.7936 - regression_loss: 0.7029 - classification_loss: 0.0907 397/500 [======================>.......] - ETA: 40s - loss: 0.7937 - regression_loss: 0.7031 - classification_loss: 0.0906 398/500 [======================>.......] 
- ETA: 39s - loss: 0.7932 - regression_loss: 0.7027 - classification_loss: 0.0905 399/500 [======================>.......] - ETA: 39s - loss: 0.7945 - regression_loss: 0.7039 - classification_loss: 0.0906 400/500 [=======================>......] - ETA: 38s - loss: 0.7945 - regression_loss: 0.7039 - classification_loss: 0.0906 401/500 [=======================>......] - ETA: 38s - loss: 0.7939 - regression_loss: 0.7035 - classification_loss: 0.0905 402/500 [=======================>......] - ETA: 38s - loss: 0.7933 - regression_loss: 0.7029 - classification_loss: 0.0904 403/500 [=======================>......] - ETA: 37s - loss: 0.7934 - regression_loss: 0.7030 - classification_loss: 0.0904 404/500 [=======================>......] - ETA: 37s - loss: 0.7924 - regression_loss: 0.7021 - classification_loss: 0.0903 405/500 [=======================>......] - ETA: 37s - loss: 0.7927 - regression_loss: 0.7024 - classification_loss: 0.0903 406/500 [=======================>......] - ETA: 36s - loss: 0.7916 - regression_loss: 0.7015 - classification_loss: 0.0902 407/500 [=======================>......] - ETA: 36s - loss: 0.7917 - regression_loss: 0.7016 - classification_loss: 0.0901 408/500 [=======================>......] - ETA: 35s - loss: 0.7916 - regression_loss: 0.7015 - classification_loss: 0.0902 409/500 [=======================>......] - ETA: 35s - loss: 0.7926 - regression_loss: 0.7024 - classification_loss: 0.0902 410/500 [=======================>......] - ETA: 35s - loss: 0.7921 - regression_loss: 0.7019 - classification_loss: 0.0902 411/500 [=======================>......] - ETA: 34s - loss: 0.7922 - regression_loss: 0.7019 - classification_loss: 0.0903 412/500 [=======================>......] - ETA: 34s - loss: 0.7920 - regression_loss: 0.7017 - classification_loss: 0.0902 413/500 [=======================>......] - ETA: 33s - loss: 0.7912 - regression_loss: 0.7011 - classification_loss: 0.0902 414/500 [=======================>......] 
- ETA: 33s - loss: 0.7915 - regression_loss: 0.7013 - classification_loss: 0.0902 415/500 [=======================>......] - ETA: 33s - loss: 0.7914 - regression_loss: 0.7012 - classification_loss: 0.0901 416/500 [=======================>......] - ETA: 32s - loss: 0.7930 - regression_loss: 0.7026 - classification_loss: 0.0904 417/500 [========================>.....] - ETA: 32s - loss: 0.7927 - regression_loss: 0.7023 - classification_loss: 0.0903 418/500 [========================>.....] - ETA: 31s - loss: 0.7928 - regression_loss: 0.7024 - classification_loss: 0.0904 419/500 [========================>.....] - ETA: 31s - loss: 0.7930 - regression_loss: 0.7025 - classification_loss: 0.0905 420/500 [========================>.....] - ETA: 31s - loss: 0.7928 - regression_loss: 0.7023 - classification_loss: 0.0904 421/500 [========================>.....] - ETA: 30s - loss: 0.7924 - regression_loss: 0.7020 - classification_loss: 0.0904 422/500 [========================>.....] - ETA: 30s - loss: 0.7917 - regression_loss: 0.7014 - classification_loss: 0.0903 423/500 [========================>.....] - ETA: 29s - loss: 0.7926 - regression_loss: 0.7022 - classification_loss: 0.0904 424/500 [========================>.....] - ETA: 29s - loss: 0.7923 - regression_loss: 0.7020 - classification_loss: 0.0903 425/500 [========================>.....] - ETA: 29s - loss: 0.7930 - regression_loss: 0.7024 - classification_loss: 0.0905 426/500 [========================>.....] - ETA: 28s - loss: 0.7938 - regression_loss: 0.7033 - classification_loss: 0.0905 427/500 [========================>.....] - ETA: 28s - loss: 0.7937 - regression_loss: 0.7031 - classification_loss: 0.0905 428/500 [========================>.....] - ETA: 28s - loss: 0.7943 - regression_loss: 0.7036 - classification_loss: 0.0906 429/500 [========================>.....] - ETA: 27s - loss: 0.7938 - regression_loss: 0.7032 - classification_loss: 0.0906 430/500 [========================>.....] 
- ETA: 27s - loss: 0.7943 - regression_loss: 0.7038 - classification_loss: 0.0906 431/500 [========================>.....] - ETA: 26s - loss: 0.7941 - regression_loss: 0.7035 - classification_loss: 0.0906 432/500 [========================>.....] - ETA: 26s - loss: 0.7929 - regression_loss: 0.7025 - classification_loss: 0.0904 433/500 [========================>.....] - ETA: 26s - loss: 0.7931 - regression_loss: 0.7027 - classification_loss: 0.0905 434/500 [=========================>....] - ETA: 25s - loss: 0.7927 - regression_loss: 0.7023 - classification_loss: 0.0904 435/500 [=========================>....] - ETA: 25s - loss: 0.7929 - regression_loss: 0.7024 - classification_loss: 0.0905 436/500 [=========================>....] - ETA: 24s - loss: 0.7938 - regression_loss: 0.7032 - classification_loss: 0.0906 437/500 [=========================>....] - ETA: 24s - loss: 0.7932 - regression_loss: 0.7027 - classification_loss: 0.0905 438/500 [=========================>....] - ETA: 24s - loss: 0.7935 - regression_loss: 0.7030 - classification_loss: 0.0905 439/500 [=========================>....] - ETA: 23s - loss: 0.7937 - regression_loss: 0.7032 - classification_loss: 0.0905 440/500 [=========================>....] - ETA: 23s - loss: 0.7936 - regression_loss: 0.7032 - classification_loss: 0.0905 441/500 [=========================>....] - ETA: 22s - loss: 0.7951 - regression_loss: 0.7045 - classification_loss: 0.0906 442/500 [=========================>....] - ETA: 22s - loss: 0.7948 - regression_loss: 0.7042 - classification_loss: 0.0906 443/500 [=========================>....] - ETA: 22s - loss: 0.7950 - regression_loss: 0.7045 - classification_loss: 0.0906 444/500 [=========================>....] - ETA: 21s - loss: 0.7941 - regression_loss: 0.7037 - classification_loss: 0.0905 445/500 [=========================>....] - ETA: 21s - loss: 0.7960 - regression_loss: 0.7053 - classification_loss: 0.0907 446/500 [=========================>....] 
- ETA: 21s - loss: 0.7962 - regression_loss: 0.7055 - classification_loss: 0.0907 447/500 [=========================>....] - ETA: 20s - loss: 0.7968 - regression_loss: 0.7060 - classification_loss: 0.0907 448/500 [=========================>....] - ETA: 20s - loss: 0.7969 - regression_loss: 0.7062 - classification_loss: 0.0907 449/500 [=========================>....] - ETA: 19s - loss: 0.7964 - regression_loss: 0.7058 - classification_loss: 0.0906 450/500 [==========================>...] - ETA: 19s - loss: 0.7962 - regression_loss: 0.7056 - classification_loss: 0.0906 451/500 [==========================>...] - ETA: 19s - loss: 0.7955 - regression_loss: 0.7050 - classification_loss: 0.0905 452/500 [==========================>...] - ETA: 18s - loss: 0.7947 - regression_loss: 0.7043 - classification_loss: 0.0904 453/500 [==========================>...] - ETA: 18s - loss: 0.7944 - regression_loss: 0.7040 - classification_loss: 0.0903 454/500 [==========================>...] - ETA: 17s - loss: 0.7940 - regression_loss: 0.7038 - classification_loss: 0.0902 455/500 [==========================>...] - ETA: 17s - loss: 0.7941 - regression_loss: 0.7039 - classification_loss: 0.0903 456/500 [==========================>...] - ETA: 17s - loss: 0.7937 - regression_loss: 0.7036 - classification_loss: 0.0901 457/500 [==========================>...] - ETA: 16s - loss: 0.7933 - regression_loss: 0.7033 - classification_loss: 0.0900 458/500 [==========================>...] - ETA: 16s - loss: 0.7940 - regression_loss: 0.7039 - classification_loss: 0.0901 459/500 [==========================>...] - ETA: 15s - loss: 0.7944 - regression_loss: 0.7042 - classification_loss: 0.0901 460/500 [==========================>...] - ETA: 15s - loss: 0.7937 - regression_loss: 0.7037 - classification_loss: 0.0900 461/500 [==========================>...] - ETA: 15s - loss: 0.7953 - regression_loss: 0.7050 - classification_loss: 0.0903 462/500 [==========================>...] 
[per-batch progress updates for epoch 10, steps 462–499, elided; loss hovering near 0.796]
500/500 [==============================] - 195s 389ms/step - loss: 0.7960 - regression_loss: 0.7062 - classification_loss: 0.0898
326 instances of class plum with average precision: 0.8464
mAP: 0.8464
Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5
Epoch 11/20
[per-batch progress updates for epoch 11, steps 1–296, elided; at step 296/500: ETA 1:19, loss: 0.7726 - regression_loss: 0.6902 - classification_loss: 0.0823]
- ETA: 1:19 - loss: 0.7718 - regression_loss: 0.6896 - classification_loss: 0.0823 298/500 [================>.............] - ETA: 1:18 - loss: 0.7713 - regression_loss: 0.6891 - classification_loss: 0.0822 299/500 [================>.............] - ETA: 1:18 - loss: 0.7700 - regression_loss: 0.6880 - classification_loss: 0.0821 300/500 [=================>............] - ETA: 1:17 - loss: 0.7686 - regression_loss: 0.6866 - classification_loss: 0.0819 301/500 [=================>............] - ETA: 1:17 - loss: 0.7678 - regression_loss: 0.6860 - classification_loss: 0.0819 302/500 [=================>............] - ETA: 1:17 - loss: 0.7667 - regression_loss: 0.6850 - classification_loss: 0.0818 303/500 [=================>............] - ETA: 1:16 - loss: 0.7666 - regression_loss: 0.6848 - classification_loss: 0.0818 304/500 [=================>............] - ETA: 1:16 - loss: 0.7661 - regression_loss: 0.6844 - classification_loss: 0.0817 305/500 [=================>............] - ETA: 1:15 - loss: 0.7661 - regression_loss: 0.6844 - classification_loss: 0.0817 306/500 [=================>............] - ETA: 1:15 - loss: 0.7664 - regression_loss: 0.6847 - classification_loss: 0.0817 307/500 [=================>............] - ETA: 1:15 - loss: 0.7661 - regression_loss: 0.6844 - classification_loss: 0.0817 308/500 [=================>............] - ETA: 1:14 - loss: 0.7651 - regression_loss: 0.6836 - classification_loss: 0.0816 309/500 [=================>............] - ETA: 1:14 - loss: 0.7661 - regression_loss: 0.6844 - classification_loss: 0.0817 310/500 [=================>............] - ETA: 1:13 - loss: 0.7657 - regression_loss: 0.6841 - classification_loss: 0.0817 311/500 [=================>............] - ETA: 1:13 - loss: 0.7650 - regression_loss: 0.6834 - classification_loss: 0.0816 312/500 [=================>............] - ETA: 1:13 - loss: 0.7675 - regression_loss: 0.6851 - classification_loss: 0.0823 313/500 [=================>............] 
- ETA: 1:12 - loss: 0.7662 - regression_loss: 0.6840 - classification_loss: 0.0822 314/500 [=================>............] - ETA: 1:12 - loss: 0.7657 - regression_loss: 0.6835 - classification_loss: 0.0821 315/500 [=================>............] - ETA: 1:12 - loss: 0.7643 - regression_loss: 0.6823 - classification_loss: 0.0819 316/500 [=================>............] - ETA: 1:11 - loss: 0.7632 - regression_loss: 0.6814 - classification_loss: 0.0818 317/500 [==================>...........] - ETA: 1:11 - loss: 0.7626 - regression_loss: 0.6809 - classification_loss: 0.0818 318/500 [==================>...........] - ETA: 1:10 - loss: 0.7623 - regression_loss: 0.6806 - classification_loss: 0.0817 319/500 [==================>...........] - ETA: 1:10 - loss: 0.7615 - regression_loss: 0.6800 - classification_loss: 0.0815 320/500 [==================>...........] - ETA: 1:10 - loss: 0.7623 - regression_loss: 0.6806 - classification_loss: 0.0816 321/500 [==================>...........] - ETA: 1:09 - loss: 0.7615 - regression_loss: 0.6799 - classification_loss: 0.0816 322/500 [==================>...........] - ETA: 1:09 - loss: 0.7625 - regression_loss: 0.6806 - classification_loss: 0.0819 323/500 [==================>...........] - ETA: 1:08 - loss: 0.7634 - regression_loss: 0.6814 - classification_loss: 0.0820 324/500 [==================>...........] - ETA: 1:08 - loss: 0.7636 - regression_loss: 0.6818 - classification_loss: 0.0818 325/500 [==================>...........] - ETA: 1:08 - loss: 0.7648 - regression_loss: 0.6829 - classification_loss: 0.0820 326/500 [==================>...........] - ETA: 1:07 - loss: 0.7640 - regression_loss: 0.6822 - classification_loss: 0.0819 327/500 [==================>...........] - ETA: 1:07 - loss: 0.7639 - regression_loss: 0.6822 - classification_loss: 0.0817 328/500 [==================>...........] - ETA: 1:06 - loss: 0.7631 - regression_loss: 0.6813 - classification_loss: 0.0818 329/500 [==================>...........] 
- ETA: 1:06 - loss: 0.7629 - regression_loss: 0.6812 - classification_loss: 0.0817 330/500 [==================>...........] - ETA: 1:06 - loss: 0.7627 - regression_loss: 0.6808 - classification_loss: 0.0819 331/500 [==================>...........] - ETA: 1:05 - loss: 0.7636 - regression_loss: 0.6816 - classification_loss: 0.0820 332/500 [==================>...........] - ETA: 1:05 - loss: 0.7642 - regression_loss: 0.6821 - classification_loss: 0.0821 333/500 [==================>...........] - ETA: 1:05 - loss: 0.7639 - regression_loss: 0.6818 - classification_loss: 0.0821 334/500 [===================>..........] - ETA: 1:04 - loss: 0.7624 - regression_loss: 0.6805 - classification_loss: 0.0819 335/500 [===================>..........] - ETA: 1:04 - loss: 0.7621 - regression_loss: 0.6803 - classification_loss: 0.0818 336/500 [===================>..........] - ETA: 1:03 - loss: 0.7616 - regression_loss: 0.6798 - classification_loss: 0.0818 337/500 [===================>..........] - ETA: 1:03 - loss: 0.7620 - regression_loss: 0.6801 - classification_loss: 0.0819 338/500 [===================>..........] - ETA: 1:03 - loss: 0.7610 - regression_loss: 0.6792 - classification_loss: 0.0818 339/500 [===================>..........] - ETA: 1:02 - loss: 0.7608 - regression_loss: 0.6790 - classification_loss: 0.0818 340/500 [===================>..........] - ETA: 1:02 - loss: 0.7592 - regression_loss: 0.6776 - classification_loss: 0.0816 341/500 [===================>..........] - ETA: 1:01 - loss: 0.7583 - regression_loss: 0.6769 - classification_loss: 0.0814 342/500 [===================>..........] - ETA: 1:01 - loss: 0.7583 - regression_loss: 0.6769 - classification_loss: 0.0814 343/500 [===================>..........] - ETA: 1:01 - loss: 0.7576 - regression_loss: 0.6763 - classification_loss: 0.0813 344/500 [===================>..........] - ETA: 1:00 - loss: 0.7575 - regression_loss: 0.6762 - classification_loss: 0.0812 345/500 [===================>..........] 
- ETA: 1:00 - loss: 0.7571 - regression_loss: 0.6759 - classification_loss: 0.0812 346/500 [===================>..........] - ETA: 59s - loss: 0.7573 - regression_loss: 0.6760 - classification_loss: 0.0812  347/500 [===================>..........] - ETA: 59s - loss: 0.7571 - regression_loss: 0.6758 - classification_loss: 0.0812 348/500 [===================>..........] - ETA: 59s - loss: 0.7567 - regression_loss: 0.6755 - classification_loss: 0.0812 349/500 [===================>..........] - ETA: 58s - loss: 0.7556 - regression_loss: 0.6745 - classification_loss: 0.0811 350/500 [====================>.........] - ETA: 58s - loss: 0.7562 - regression_loss: 0.6751 - classification_loss: 0.0810 351/500 [====================>.........] - ETA: 58s - loss: 0.7561 - regression_loss: 0.6751 - classification_loss: 0.0809 352/500 [====================>.........] - ETA: 57s - loss: 0.7578 - regression_loss: 0.6767 - classification_loss: 0.0811 353/500 [====================>.........] - ETA: 57s - loss: 0.7585 - regression_loss: 0.6774 - classification_loss: 0.0811 354/500 [====================>.........] - ETA: 56s - loss: 0.7590 - regression_loss: 0.6780 - classification_loss: 0.0810 355/500 [====================>.........] - ETA: 56s - loss: 0.7587 - regression_loss: 0.6777 - classification_loss: 0.0811 356/500 [====================>.........] - ETA: 56s - loss: 0.7580 - regression_loss: 0.6770 - classification_loss: 0.0810 357/500 [====================>.........] - ETA: 55s - loss: 0.7587 - regression_loss: 0.6776 - classification_loss: 0.0811 358/500 [====================>.........] - ETA: 55s - loss: 0.7586 - regression_loss: 0.6776 - classification_loss: 0.0810 359/500 [====================>.........] - ETA: 54s - loss: 0.7584 - regression_loss: 0.6775 - classification_loss: 0.0809 360/500 [====================>.........] - ETA: 54s - loss: 0.7586 - regression_loss: 0.6776 - classification_loss: 0.0810 361/500 [====================>.........] 
- ETA: 54s - loss: 0.7585 - regression_loss: 0.6775 - classification_loss: 0.0810 362/500 [====================>.........] - ETA: 53s - loss: 0.7573 - regression_loss: 0.6764 - classification_loss: 0.0809 363/500 [====================>.........] - ETA: 53s - loss: 0.7571 - regression_loss: 0.6763 - classification_loss: 0.0808 364/500 [====================>.........] - ETA: 53s - loss: 0.7562 - regression_loss: 0.6755 - classification_loss: 0.0807 365/500 [====================>.........] - ETA: 52s - loss: 0.7557 - regression_loss: 0.6752 - classification_loss: 0.0805 366/500 [====================>.........] - ETA: 52s - loss: 0.7555 - regression_loss: 0.6750 - classification_loss: 0.0805 367/500 [=====================>........] - ETA: 51s - loss: 0.7546 - regression_loss: 0.6742 - classification_loss: 0.0804 368/500 [=====================>........] - ETA: 51s - loss: 0.7540 - regression_loss: 0.6737 - classification_loss: 0.0803 369/500 [=====================>........] - ETA: 51s - loss: 0.7535 - regression_loss: 0.6733 - classification_loss: 0.0802 370/500 [=====================>........] - ETA: 50s - loss: 0.7532 - regression_loss: 0.6731 - classification_loss: 0.0801 371/500 [=====================>........] - ETA: 50s - loss: 0.7526 - regression_loss: 0.6725 - classification_loss: 0.0801 372/500 [=====================>........] - ETA: 49s - loss: 0.7519 - regression_loss: 0.6719 - classification_loss: 0.0801 373/500 [=====================>........] - ETA: 49s - loss: 0.7511 - regression_loss: 0.6710 - classification_loss: 0.0800 374/500 [=====================>........] - ETA: 49s - loss: 0.7508 - regression_loss: 0.6708 - classification_loss: 0.0800 375/500 [=====================>........] - ETA: 48s - loss: 0.7508 - regression_loss: 0.6708 - classification_loss: 0.0800 376/500 [=====================>........] - ETA: 48s - loss: 0.7497 - regression_loss: 0.6698 - classification_loss: 0.0799 377/500 [=====================>........] 
- ETA: 47s - loss: 0.7500 - regression_loss: 0.6702 - classification_loss: 0.0798 378/500 [=====================>........] - ETA: 47s - loss: 0.7508 - regression_loss: 0.6708 - classification_loss: 0.0800 379/500 [=====================>........] - ETA: 47s - loss: 0.7497 - regression_loss: 0.6698 - classification_loss: 0.0799 380/500 [=====================>........] - ETA: 46s - loss: 0.7502 - regression_loss: 0.6703 - classification_loss: 0.0800 381/500 [=====================>........] - ETA: 46s - loss: 0.7501 - regression_loss: 0.6701 - classification_loss: 0.0799 382/500 [=====================>........] - ETA: 45s - loss: 0.7490 - regression_loss: 0.6691 - classification_loss: 0.0798 383/500 [=====================>........] - ETA: 45s - loss: 0.7487 - regression_loss: 0.6690 - classification_loss: 0.0797 384/500 [======================>.......] - ETA: 45s - loss: 0.7490 - regression_loss: 0.6694 - classification_loss: 0.0797 385/500 [======================>.......] - ETA: 44s - loss: 0.7492 - regression_loss: 0.6696 - classification_loss: 0.0796 386/500 [======================>.......] - ETA: 44s - loss: 0.7488 - regression_loss: 0.6693 - classification_loss: 0.0796 387/500 [======================>.......] - ETA: 44s - loss: 0.7485 - regression_loss: 0.6690 - classification_loss: 0.0795 388/500 [======================>.......] - ETA: 43s - loss: 0.7483 - regression_loss: 0.6689 - classification_loss: 0.0794 389/500 [======================>.......] - ETA: 43s - loss: 0.7482 - regression_loss: 0.6688 - classification_loss: 0.0794 390/500 [======================>.......] - ETA: 42s - loss: 0.7480 - regression_loss: 0.6687 - classification_loss: 0.0793 391/500 [======================>.......] - ETA: 42s - loss: 0.7477 - regression_loss: 0.6684 - classification_loss: 0.0792 392/500 [======================>.......] - ETA: 42s - loss: 0.7472 - regression_loss: 0.6680 - classification_loss: 0.0792 393/500 [======================>.......] 
- ETA: 41s - loss: 0.7480 - regression_loss: 0.6686 - classification_loss: 0.0794 394/500 [======================>.......] - ETA: 41s - loss: 0.7500 - regression_loss: 0.6699 - classification_loss: 0.0800 395/500 [======================>.......] - ETA: 40s - loss: 0.7493 - regression_loss: 0.6694 - classification_loss: 0.0799 396/500 [======================>.......] - ETA: 40s - loss: 0.7492 - regression_loss: 0.6693 - classification_loss: 0.0799 397/500 [======================>.......] - ETA: 40s - loss: 0.7494 - regression_loss: 0.6696 - classification_loss: 0.0798 398/500 [======================>.......] - ETA: 39s - loss: 0.7503 - regression_loss: 0.6705 - classification_loss: 0.0797 399/500 [======================>.......] - ETA: 39s - loss: 0.7490 - regression_loss: 0.6694 - classification_loss: 0.0796 400/500 [=======================>......] - ETA: 38s - loss: 0.7494 - regression_loss: 0.6698 - classification_loss: 0.0796 401/500 [=======================>......] - ETA: 38s - loss: 0.7508 - regression_loss: 0.6711 - classification_loss: 0.0797 402/500 [=======================>......] - ETA: 38s - loss: 0.7501 - regression_loss: 0.6704 - classification_loss: 0.0796 403/500 [=======================>......] - ETA: 37s - loss: 0.7500 - regression_loss: 0.6704 - classification_loss: 0.0796 404/500 [=======================>......] - ETA: 37s - loss: 0.7498 - regression_loss: 0.6702 - classification_loss: 0.0796 405/500 [=======================>......] - ETA: 37s - loss: 0.7491 - regression_loss: 0.6696 - classification_loss: 0.0795 406/500 [=======================>......] - ETA: 36s - loss: 0.7484 - regression_loss: 0.6691 - classification_loss: 0.0794 407/500 [=======================>......] - ETA: 36s - loss: 0.7481 - regression_loss: 0.6688 - classification_loss: 0.0793 408/500 [=======================>......] - ETA: 35s - loss: 0.7494 - regression_loss: 0.6698 - classification_loss: 0.0795 409/500 [=======================>......] 
- ETA: 35s - loss: 0.7509 - regression_loss: 0.6713 - classification_loss: 0.0797 410/500 [=======================>......] - ETA: 35s - loss: 0.7505 - regression_loss: 0.6709 - classification_loss: 0.0796 411/500 [=======================>......] - ETA: 34s - loss: 0.7501 - regression_loss: 0.6705 - classification_loss: 0.0796 412/500 [=======================>......] - ETA: 34s - loss: 0.7506 - regression_loss: 0.6709 - classification_loss: 0.0797 413/500 [=======================>......] - ETA: 33s - loss: 0.7519 - regression_loss: 0.6721 - classification_loss: 0.0798 414/500 [=======================>......] - ETA: 33s - loss: 0.7514 - regression_loss: 0.6717 - classification_loss: 0.0797 415/500 [=======================>......] - ETA: 33s - loss: 0.7520 - regression_loss: 0.6723 - classification_loss: 0.0797 416/500 [=======================>......] - ETA: 32s - loss: 0.7519 - regression_loss: 0.6721 - classification_loss: 0.0797 417/500 [========================>.....] - ETA: 32s - loss: 0.7510 - regression_loss: 0.6713 - classification_loss: 0.0796 418/500 [========================>.....] - ETA: 31s - loss: 0.7512 - regression_loss: 0.6716 - classification_loss: 0.0796 419/500 [========================>.....] - ETA: 31s - loss: 0.7515 - regression_loss: 0.6718 - classification_loss: 0.0797 420/500 [========================>.....] - ETA: 31s - loss: 0.7511 - regression_loss: 0.6714 - classification_loss: 0.0796 421/500 [========================>.....] - ETA: 30s - loss: 0.7511 - regression_loss: 0.6714 - classification_loss: 0.0796 422/500 [========================>.....] - ETA: 30s - loss: 0.7514 - regression_loss: 0.6717 - classification_loss: 0.0797 423/500 [========================>.....] - ETA: 30s - loss: 0.7515 - regression_loss: 0.6718 - classification_loss: 0.0797 424/500 [========================>.....] - ETA: 29s - loss: 0.7506 - regression_loss: 0.6710 - classification_loss: 0.0796 425/500 [========================>.....] 
- ETA: 29s - loss: 0.7511 - regression_loss: 0.6715 - classification_loss: 0.0797 426/500 [========================>.....] - ETA: 28s - loss: 0.7509 - regression_loss: 0.6713 - classification_loss: 0.0796 427/500 [========================>.....] - ETA: 28s - loss: 0.7504 - regression_loss: 0.6708 - classification_loss: 0.0796 428/500 [========================>.....] - ETA: 28s - loss: 0.7502 - regression_loss: 0.6706 - classification_loss: 0.0795 429/500 [========================>.....] - ETA: 27s - loss: 0.7502 - regression_loss: 0.6706 - classification_loss: 0.0795 430/500 [========================>.....] - ETA: 27s - loss: 0.7511 - regression_loss: 0.6714 - classification_loss: 0.0797 431/500 [========================>.....] - ETA: 26s - loss: 0.7509 - regression_loss: 0.6712 - classification_loss: 0.0797 432/500 [========================>.....] - ETA: 26s - loss: 0.7511 - regression_loss: 0.6714 - classification_loss: 0.0797 433/500 [========================>.....] - ETA: 26s - loss: 0.7510 - regression_loss: 0.6713 - classification_loss: 0.0797 434/500 [=========================>....] - ETA: 25s - loss: 0.7507 - regression_loss: 0.6710 - classification_loss: 0.0797 435/500 [=========================>....] - ETA: 25s - loss: 0.7502 - regression_loss: 0.6707 - classification_loss: 0.0796 436/500 [=========================>....] - ETA: 24s - loss: 0.7496 - regression_loss: 0.6700 - classification_loss: 0.0795 437/500 [=========================>....] - ETA: 24s - loss: 0.7492 - regression_loss: 0.6697 - classification_loss: 0.0795 438/500 [=========================>....] - ETA: 24s - loss: 0.7485 - regression_loss: 0.6691 - classification_loss: 0.0794 439/500 [=========================>....] - ETA: 23s - loss: 0.7484 - regression_loss: 0.6690 - classification_loss: 0.0794 440/500 [=========================>....] - ETA: 23s - loss: 0.7475 - regression_loss: 0.6682 - classification_loss: 0.0793 441/500 [=========================>....] 
- ETA: 23s - loss: 0.7473 - regression_loss: 0.6680 - classification_loss: 0.0793 442/500 [=========================>....] - ETA: 22s - loss: 0.7474 - regression_loss: 0.6681 - classification_loss: 0.0793 443/500 [=========================>....] - ETA: 22s - loss: 0.7483 - regression_loss: 0.6688 - classification_loss: 0.0794 444/500 [=========================>....] - ETA: 21s - loss: 0.7483 - regression_loss: 0.6689 - classification_loss: 0.0794 445/500 [=========================>....] - ETA: 21s - loss: 0.7482 - regression_loss: 0.6688 - classification_loss: 0.0794 446/500 [=========================>....] - ETA: 21s - loss: 0.7482 - regression_loss: 0.6689 - classification_loss: 0.0793 447/500 [=========================>....] - ETA: 20s - loss: 0.7487 - regression_loss: 0.6694 - classification_loss: 0.0793 448/500 [=========================>....] - ETA: 20s - loss: 0.7487 - regression_loss: 0.6695 - classification_loss: 0.0792 449/500 [=========================>....] - ETA: 19s - loss: 0.7500 - regression_loss: 0.6706 - classification_loss: 0.0794 450/500 [==========================>...] - ETA: 19s - loss: 0.7494 - regression_loss: 0.6702 - classification_loss: 0.0793 451/500 [==========================>...] - ETA: 19s - loss: 0.7497 - regression_loss: 0.6704 - classification_loss: 0.0793 452/500 [==========================>...] - ETA: 18s - loss: 0.7497 - regression_loss: 0.6704 - classification_loss: 0.0793 453/500 [==========================>...] - ETA: 18s - loss: 0.7498 - regression_loss: 0.6705 - classification_loss: 0.0793 454/500 [==========================>...] - ETA: 17s - loss: 0.7525 - regression_loss: 0.6719 - classification_loss: 0.0806 455/500 [==========================>...] - ETA: 17s - loss: 0.7522 - regression_loss: 0.6716 - classification_loss: 0.0806 456/500 [==========================>...] - ETA: 17s - loss: 0.7524 - regression_loss: 0.6719 - classification_loss: 0.0805 457/500 [==========================>...] 
- ETA: 16s - loss: 0.7519 - regression_loss: 0.6714 - classification_loss: 0.0805 458/500 [==========================>...] - ETA: 16s - loss: 0.7516 - regression_loss: 0.6711 - classification_loss: 0.0805 459/500 [==========================>...] - ETA: 15s - loss: 0.7514 - regression_loss: 0.6709 - classification_loss: 0.0804 460/500 [==========================>...] - ETA: 15s - loss: 0.7510 - regression_loss: 0.6706 - classification_loss: 0.0804 461/500 [==========================>...] - ETA: 15s - loss: 0.7506 - regression_loss: 0.6703 - classification_loss: 0.0803 462/500 [==========================>...] - ETA: 14s - loss: 0.7506 - regression_loss: 0.6702 - classification_loss: 0.0804 463/500 [==========================>...] - ETA: 14s - loss: 0.7510 - regression_loss: 0.6706 - classification_loss: 0.0804 464/500 [==========================>...] - ETA: 14s - loss: 0.7506 - regression_loss: 0.6703 - classification_loss: 0.0803 465/500 [==========================>...] - ETA: 13s - loss: 0.7499 - regression_loss: 0.6696 - classification_loss: 0.0802 466/500 [==========================>...] - ETA: 13s - loss: 0.7502 - regression_loss: 0.6700 - classification_loss: 0.0802 467/500 [===========================>..] - ETA: 12s - loss: 0.7497 - regression_loss: 0.6695 - classification_loss: 0.0802 468/500 [===========================>..] - ETA: 12s - loss: 0.7526 - regression_loss: 0.6716 - classification_loss: 0.0809 469/500 [===========================>..] - ETA: 12s - loss: 0.7528 - regression_loss: 0.6719 - classification_loss: 0.0809 470/500 [===========================>..] - ETA: 11s - loss: 0.7527 - regression_loss: 0.6718 - classification_loss: 0.0809 471/500 [===========================>..] - ETA: 11s - loss: 0.7542 - regression_loss: 0.6733 - classification_loss: 0.0809 472/500 [===========================>..] - ETA: 10s - loss: 0.7538 - regression_loss: 0.6730 - classification_loss: 0.0809 473/500 [===========================>..] 
- ETA: 10s - loss: 0.7532 - regression_loss: 0.6724 - classification_loss: 0.0808 474/500 [===========================>..] - ETA: 10s - loss: 0.7531 - regression_loss: 0.6723 - classification_loss: 0.0808 475/500 [===========================>..] - ETA: 9s - loss: 0.7537 - regression_loss: 0.6729 - classification_loss: 0.0808  476/500 [===========================>..] - ETA: 9s - loss: 0.7539 - regression_loss: 0.6731 - classification_loss: 0.0808 477/500 [===========================>..] - ETA: 8s - loss: 0.7529 - regression_loss: 0.6722 - classification_loss: 0.0807 478/500 [===========================>..] - ETA: 8s - loss: 0.7532 - regression_loss: 0.6725 - classification_loss: 0.0807 479/500 [===========================>..] - ETA: 8s - loss: 0.7532 - regression_loss: 0.6725 - classification_loss: 0.0807 480/500 [===========================>..] - ETA: 7s - loss: 0.7531 - regression_loss: 0.6724 - classification_loss: 0.0807 481/500 [===========================>..] - ETA: 7s - loss: 0.7527 - regression_loss: 0.6721 - classification_loss: 0.0806 482/500 [===========================>..] - ETA: 7s - loss: 0.7526 - regression_loss: 0.6721 - classification_loss: 0.0805 483/500 [===========================>..] - ETA: 6s - loss: 0.7526 - regression_loss: 0.6721 - classification_loss: 0.0805 484/500 [============================>.] - ETA: 6s - loss: 0.7517 - regression_loss: 0.6713 - classification_loss: 0.0804 485/500 [============================>.] - ETA: 5s - loss: 0.7511 - regression_loss: 0.6708 - classification_loss: 0.0803 486/500 [============================>.] - ETA: 5s - loss: 0.7515 - regression_loss: 0.6711 - classification_loss: 0.0804 487/500 [============================>.] - ETA: 5s - loss: 0.7512 - regression_loss: 0.6709 - classification_loss: 0.0803 488/500 [============================>.] - ETA: 4s - loss: 0.7509 - regression_loss: 0.6707 - classification_loss: 0.0802 489/500 [============================>.] 
- ETA: 4s - loss: 0.7497 - regression_loss: 0.6696 - classification_loss: 0.0801
[progress frames for steps 490-499 elided]
500/500 [==============================] - 195s 390ms/step - loss: 0.7483 - regression_loss: 0.6684 - classification_loss: 0.0799
326 instances of class plum with average precision: 0.8159
mAP: 0.8159
Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5
Epoch 12/20
[progress frames for steps 1-3 of epoch 12 elided]
4/500 [..............................]
- ETA: 3:10 - loss: 0.5285 - regression_loss: 0.4768 - classification_loss: 0.0517
[per-step progress frames for steps 5-67 of epoch 12 elided; running loss climbed from ~0.53 to ~0.75 as the epoch average stabilized]
68/500 [===>..........................]
- ETA: 2:47 - loss: 0.7441 - regression_loss: 0.6689 - classification_loss: 0.0752 69/500 [===>..........................] - ETA: 2:46 - loss: 0.7428 - regression_loss: 0.6678 - classification_loss: 0.0751 70/500 [===>..........................] - ETA: 2:46 - loss: 0.7387 - regression_loss: 0.6641 - classification_loss: 0.0746 71/500 [===>..........................] - ETA: 2:46 - loss: 0.7390 - regression_loss: 0.6646 - classification_loss: 0.0744 72/500 [===>..........................] - ETA: 2:45 - loss: 0.7387 - regression_loss: 0.6644 - classification_loss: 0.0743 73/500 [===>..........................] - ETA: 2:45 - loss: 0.7359 - regression_loss: 0.6619 - classification_loss: 0.0741 74/500 [===>..........................] - ETA: 2:45 - loss: 0.7399 - regression_loss: 0.6659 - classification_loss: 0.0740 75/500 [===>..........................] - ETA: 2:44 - loss: 0.7486 - regression_loss: 0.6743 - classification_loss: 0.0743 76/500 [===>..........................] - ETA: 2:44 - loss: 0.7487 - regression_loss: 0.6745 - classification_loss: 0.0742 77/500 [===>..........................] - ETA: 2:44 - loss: 0.7529 - regression_loss: 0.6779 - classification_loss: 0.0749 78/500 [===>..........................] - ETA: 2:43 - loss: 0.7504 - regression_loss: 0.6759 - classification_loss: 0.0745 79/500 [===>..........................] - ETA: 2:43 - loss: 0.7512 - regression_loss: 0.6759 - classification_loss: 0.0752 80/500 [===>..........................] - ETA: 2:43 - loss: 0.7522 - regression_loss: 0.6772 - classification_loss: 0.0750 81/500 [===>..........................] - ETA: 2:42 - loss: 0.7534 - regression_loss: 0.6787 - classification_loss: 0.0747 82/500 [===>..........................] - ETA: 2:42 - loss: 0.7626 - regression_loss: 0.6852 - classification_loss: 0.0774 83/500 [===>..........................] - ETA: 2:41 - loss: 0.7603 - regression_loss: 0.6832 - classification_loss: 0.0770 84/500 [====>.........................] 
- ETA: 2:41 - loss: 0.7675 - regression_loss: 0.6897 - classification_loss: 0.0779 85/500 [====>.........................] - ETA: 2:41 - loss: 0.7645 - regression_loss: 0.6871 - classification_loss: 0.0774 86/500 [====>.........................] - ETA: 2:40 - loss: 0.7652 - regression_loss: 0.6879 - classification_loss: 0.0773 87/500 [====>.........................] - ETA: 2:40 - loss: 0.7632 - regression_loss: 0.6863 - classification_loss: 0.0770 88/500 [====>.........................] - ETA: 2:39 - loss: 0.7625 - regression_loss: 0.6859 - classification_loss: 0.0767 89/500 [====>.........................] - ETA: 2:39 - loss: 0.7586 - regression_loss: 0.6823 - classification_loss: 0.0763 90/500 [====>.........................] - ETA: 2:39 - loss: 0.7564 - regression_loss: 0.6803 - classification_loss: 0.0761 91/500 [====>.........................] - ETA: 2:38 - loss: 0.7548 - regression_loss: 0.6788 - classification_loss: 0.0760 92/500 [====>.........................] - ETA: 2:38 - loss: 0.7557 - regression_loss: 0.6795 - classification_loss: 0.0762 93/500 [====>.........................] - ETA: 2:37 - loss: 0.7534 - regression_loss: 0.6774 - classification_loss: 0.0760 94/500 [====>.........................] - ETA: 2:37 - loss: 0.7511 - regression_loss: 0.6754 - classification_loss: 0.0757 95/500 [====>.........................] - ETA: 2:37 - loss: 0.7485 - regression_loss: 0.6730 - classification_loss: 0.0755 96/500 [====>.........................] - ETA: 2:36 - loss: 0.7500 - regression_loss: 0.6738 - classification_loss: 0.0762 97/500 [====>.........................] - ETA: 2:36 - loss: 0.7460 - regression_loss: 0.6703 - classification_loss: 0.0757 98/500 [====>.........................] - ETA: 2:35 - loss: 0.7425 - regression_loss: 0.6671 - classification_loss: 0.0754 99/500 [====>.........................] - ETA: 2:35 - loss: 0.7394 - regression_loss: 0.6644 - classification_loss: 0.0750 100/500 [=====>........................] 
- ETA: 2:35 - loss: 0.7390 - regression_loss: 0.6641 - classification_loss: 0.0749 101/500 [=====>........................] - ETA: 2:34 - loss: 0.7401 - regression_loss: 0.6653 - classification_loss: 0.0749 102/500 [=====>........................] - ETA: 2:34 - loss: 0.7387 - regression_loss: 0.6641 - classification_loss: 0.0746 103/500 [=====>........................] - ETA: 2:33 - loss: 0.7371 - regression_loss: 0.6628 - classification_loss: 0.0743 104/500 [=====>........................] - ETA: 2:33 - loss: 0.7342 - regression_loss: 0.6602 - classification_loss: 0.0740 105/500 [=====>........................] - ETA: 2:33 - loss: 0.7339 - regression_loss: 0.6598 - classification_loss: 0.0741 106/500 [=====>........................] - ETA: 2:32 - loss: 0.7332 - regression_loss: 0.6593 - classification_loss: 0.0739 107/500 [=====>........................] - ETA: 2:32 - loss: 0.7305 - regression_loss: 0.6570 - classification_loss: 0.0735 108/500 [=====>........................] - ETA: 2:32 - loss: 0.7274 - regression_loss: 0.6543 - classification_loss: 0.0731 109/500 [=====>........................] - ETA: 2:31 - loss: 0.7275 - regression_loss: 0.6542 - classification_loss: 0.0732 110/500 [=====>........................] - ETA: 2:31 - loss: 0.7246 - regression_loss: 0.6515 - classification_loss: 0.0731 111/500 [=====>........................] - ETA: 2:30 - loss: 0.7233 - regression_loss: 0.6504 - classification_loss: 0.0729 112/500 [=====>........................] - ETA: 2:30 - loss: 0.7231 - regression_loss: 0.6503 - classification_loss: 0.0727 113/500 [=====>........................] - ETA: 2:30 - loss: 0.7219 - regression_loss: 0.6493 - classification_loss: 0.0726 114/500 [=====>........................] - ETA: 2:29 - loss: 0.7212 - regression_loss: 0.6489 - classification_loss: 0.0723 115/500 [=====>........................] - ETA: 2:29 - loss: 0.7191 - regression_loss: 0.6470 - classification_loss: 0.0721 116/500 [=====>........................] 
- ETA: 2:29 - loss: 0.7249 - regression_loss: 0.6520 - classification_loss: 0.0729 117/500 [======>.......................] - ETA: 2:28 - loss: 0.7263 - regression_loss: 0.6532 - classification_loss: 0.0731 118/500 [======>.......................] - ETA: 2:28 - loss: 0.7253 - regression_loss: 0.6522 - classification_loss: 0.0731 119/500 [======>.......................] - ETA: 2:27 - loss: 0.7254 - regression_loss: 0.6522 - classification_loss: 0.0733 120/500 [======>.......................] - ETA: 2:27 - loss: 0.7242 - regression_loss: 0.6513 - classification_loss: 0.0730 121/500 [======>.......................] - ETA: 2:26 - loss: 0.7234 - regression_loss: 0.6506 - classification_loss: 0.0728 122/500 [======>.......................] - ETA: 2:26 - loss: 0.7227 - regression_loss: 0.6501 - classification_loss: 0.0726 123/500 [======>.......................] - ETA: 2:26 - loss: 0.7203 - regression_loss: 0.6481 - classification_loss: 0.0722 124/500 [======>.......................] - ETA: 2:25 - loss: 0.7181 - regression_loss: 0.6462 - classification_loss: 0.0719 125/500 [======>.......................] - ETA: 2:25 - loss: 0.7185 - regression_loss: 0.6465 - classification_loss: 0.0720 126/500 [======>.......................] - ETA: 2:25 - loss: 0.7144 - regression_loss: 0.6428 - classification_loss: 0.0716 127/500 [======>.......................] - ETA: 2:24 - loss: 0.7146 - regression_loss: 0.6428 - classification_loss: 0.0718 128/500 [======>.......................] - ETA: 2:24 - loss: 0.7138 - regression_loss: 0.6421 - classification_loss: 0.0717 129/500 [======>.......................] - ETA: 2:23 - loss: 0.7146 - regression_loss: 0.6428 - classification_loss: 0.0718 130/500 [======>.......................] - ETA: 2:23 - loss: 0.7139 - regression_loss: 0.6422 - classification_loss: 0.0717 131/500 [======>.......................] - ETA: 2:23 - loss: 0.7145 - regression_loss: 0.6425 - classification_loss: 0.0720 132/500 [======>.......................] 
- ETA: 2:22 - loss: 0.7156 - regression_loss: 0.6437 - classification_loss: 0.0719 133/500 [======>.......................] - ETA: 2:22 - loss: 0.7183 - regression_loss: 0.6464 - classification_loss: 0.0720 134/500 [=======>......................] - ETA: 2:21 - loss: 0.7186 - regression_loss: 0.6468 - classification_loss: 0.0719 135/500 [=======>......................] - ETA: 2:21 - loss: 0.7183 - regression_loss: 0.6466 - classification_loss: 0.0717 136/500 [=======>......................] - ETA: 2:21 - loss: 0.7165 - regression_loss: 0.6450 - classification_loss: 0.0714 137/500 [=======>......................] - ETA: 2:20 - loss: 0.7149 - regression_loss: 0.6437 - classification_loss: 0.0712 138/500 [=======>......................] - ETA: 2:20 - loss: 0.7161 - regression_loss: 0.6448 - classification_loss: 0.0713 139/500 [=======>......................] - ETA: 2:20 - loss: 0.7154 - regression_loss: 0.6443 - classification_loss: 0.0711 140/500 [=======>......................] - ETA: 2:19 - loss: 0.7158 - regression_loss: 0.6446 - classification_loss: 0.0712 141/500 [=======>......................] - ETA: 2:19 - loss: 0.7162 - regression_loss: 0.6450 - classification_loss: 0.0712 142/500 [=======>......................] - ETA: 2:18 - loss: 0.7181 - regression_loss: 0.6466 - classification_loss: 0.0715 143/500 [=======>......................] - ETA: 2:18 - loss: 0.7216 - regression_loss: 0.6492 - classification_loss: 0.0724 144/500 [=======>......................] - ETA: 2:18 - loss: 0.7209 - regression_loss: 0.6486 - classification_loss: 0.0723 145/500 [=======>......................] - ETA: 2:17 - loss: 0.7205 - regression_loss: 0.6481 - classification_loss: 0.0724 146/500 [=======>......................] - ETA: 2:17 - loss: 0.7194 - regression_loss: 0.6470 - classification_loss: 0.0724 147/500 [=======>......................] - ETA: 2:17 - loss: 0.7170 - regression_loss: 0.6446 - classification_loss: 0.0724 148/500 [=======>......................] 
- ETA: 2:16 - loss: 0.7195 - regression_loss: 0.6465 - classification_loss: 0.0729 149/500 [=======>......................] - ETA: 2:16 - loss: 0.7192 - regression_loss: 0.6463 - classification_loss: 0.0730 150/500 [========>.....................] - ETA: 2:15 - loss: 0.7170 - regression_loss: 0.6443 - classification_loss: 0.0727 151/500 [========>.....................] - ETA: 2:15 - loss: 0.7170 - regression_loss: 0.6441 - classification_loss: 0.0729 152/500 [========>.....................] - ETA: 2:15 - loss: 0.7153 - regression_loss: 0.6426 - classification_loss: 0.0726 153/500 [========>.....................] - ETA: 2:14 - loss: 0.7130 - regression_loss: 0.6406 - classification_loss: 0.0724 154/500 [========>.....................] - ETA: 2:14 - loss: 0.7111 - regression_loss: 0.6389 - classification_loss: 0.0722 155/500 [========>.....................] - ETA: 2:13 - loss: 0.7149 - regression_loss: 0.6423 - classification_loss: 0.0726 156/500 [========>.....................] - ETA: 2:13 - loss: 0.7138 - regression_loss: 0.6414 - classification_loss: 0.0724 157/500 [========>.....................] - ETA: 2:13 - loss: 0.7117 - regression_loss: 0.6395 - classification_loss: 0.0722 158/500 [========>.....................] - ETA: 2:12 - loss: 0.7095 - regression_loss: 0.6375 - classification_loss: 0.0720 159/500 [========>.....................] - ETA: 2:12 - loss: 0.7118 - regression_loss: 0.6396 - classification_loss: 0.0722 160/500 [========>.....................] - ETA: 2:11 - loss: 0.7137 - regression_loss: 0.6414 - classification_loss: 0.0723 161/500 [========>.....................] - ETA: 2:11 - loss: 0.7125 - regression_loss: 0.6403 - classification_loss: 0.0721 162/500 [========>.....................] - ETA: 2:11 - loss: 0.7104 - regression_loss: 0.6385 - classification_loss: 0.0719 163/500 [========>.....................] - ETA: 2:10 - loss: 0.7110 - regression_loss: 0.6390 - classification_loss: 0.0720 164/500 [========>.....................] 
- ETA: 2:10 - loss: 0.7104 - regression_loss: 0.6386 - classification_loss: 0.0718 165/500 [========>.....................] - ETA: 2:09 - loss: 0.7126 - regression_loss: 0.6404 - classification_loss: 0.0722 166/500 [========>.....................] - ETA: 2:09 - loss: 0.7119 - regression_loss: 0.6399 - classification_loss: 0.0720 167/500 [=========>....................] - ETA: 2:09 - loss: 0.7145 - regression_loss: 0.6419 - classification_loss: 0.0726 168/500 [=========>....................] - ETA: 2:08 - loss: 0.7124 - regression_loss: 0.6400 - classification_loss: 0.0723 169/500 [=========>....................] - ETA: 2:08 - loss: 0.7102 - regression_loss: 0.6382 - classification_loss: 0.0721 170/500 [=========>....................] - ETA: 2:08 - loss: 0.7086 - regression_loss: 0.6365 - classification_loss: 0.0721 171/500 [=========>....................] - ETA: 2:07 - loss: 0.7126 - regression_loss: 0.6405 - classification_loss: 0.0721 172/500 [=========>....................] - ETA: 2:07 - loss: 0.7121 - regression_loss: 0.6399 - classification_loss: 0.0722 173/500 [=========>....................] - ETA: 2:06 - loss: 0.7115 - regression_loss: 0.6395 - classification_loss: 0.0720 174/500 [=========>....................] - ETA: 2:06 - loss: 0.7103 - regression_loss: 0.6384 - classification_loss: 0.0719 175/500 [=========>....................] - ETA: 2:06 - loss: 0.7109 - regression_loss: 0.6392 - classification_loss: 0.0717 176/500 [=========>....................] - ETA: 2:05 - loss: 0.7110 - regression_loss: 0.6394 - classification_loss: 0.0716 177/500 [=========>....................] - ETA: 2:05 - loss: 0.7113 - regression_loss: 0.6396 - classification_loss: 0.0717 178/500 [=========>....................] - ETA: 2:04 - loss: 0.7096 - regression_loss: 0.6381 - classification_loss: 0.0715 179/500 [=========>....................] - ETA: 2:04 - loss: 0.7092 - regression_loss: 0.6378 - classification_loss: 0.0714 180/500 [=========>....................] 
- ETA: 2:04 - loss: 0.7069 - regression_loss: 0.6358 - classification_loss: 0.0711 181/500 [=========>....................] - ETA: 2:03 - loss: 0.7074 - regression_loss: 0.6362 - classification_loss: 0.0712 182/500 [=========>....................] - ETA: 2:03 - loss: 0.7054 - regression_loss: 0.6345 - classification_loss: 0.0709 183/500 [=========>....................] - ETA: 2:03 - loss: 0.7063 - regression_loss: 0.6353 - classification_loss: 0.0710 184/500 [==========>...................] - ETA: 2:02 - loss: 0.7057 - regression_loss: 0.6349 - classification_loss: 0.0708 185/500 [==========>...................] - ETA: 2:02 - loss: 0.7080 - regression_loss: 0.6371 - classification_loss: 0.0709 186/500 [==========>...................] - ETA: 2:01 - loss: 0.7081 - regression_loss: 0.6373 - classification_loss: 0.0708 187/500 [==========>...................] - ETA: 2:01 - loss: 0.7087 - regression_loss: 0.6376 - classification_loss: 0.0711 188/500 [==========>...................] - ETA: 2:01 - loss: 0.7112 - regression_loss: 0.6400 - classification_loss: 0.0712 189/500 [==========>...................] - ETA: 2:00 - loss: 0.7097 - regression_loss: 0.6387 - classification_loss: 0.0710 190/500 [==========>...................] - ETA: 2:00 - loss: 0.7115 - regression_loss: 0.6404 - classification_loss: 0.0711 191/500 [==========>...................] - ETA: 1:59 - loss: 0.7129 - regression_loss: 0.6416 - classification_loss: 0.0713 192/500 [==========>...................] - ETA: 1:59 - loss: 0.7135 - regression_loss: 0.6422 - classification_loss: 0.0713 193/500 [==========>...................] - ETA: 1:59 - loss: 0.7139 - regression_loss: 0.6426 - classification_loss: 0.0714 194/500 [==========>...................] - ETA: 1:58 - loss: 0.7135 - regression_loss: 0.6421 - classification_loss: 0.0714 195/500 [==========>...................] - ETA: 1:58 - loss: 0.7118 - regression_loss: 0.6406 - classification_loss: 0.0712 196/500 [==========>...................] 
- ETA: 1:58 - loss: 0.7133 - regression_loss: 0.6419 - classification_loss: 0.0714 197/500 [==========>...................] - ETA: 1:57 - loss: 0.7151 - regression_loss: 0.6437 - classification_loss: 0.0714 198/500 [==========>...................] - ETA: 1:57 - loss: 0.7142 - regression_loss: 0.6430 - classification_loss: 0.0712 199/500 [==========>...................] - ETA: 1:56 - loss: 0.7129 - regression_loss: 0.6419 - classification_loss: 0.0710 200/500 [===========>..................] - ETA: 1:56 - loss: 0.7137 - regression_loss: 0.6426 - classification_loss: 0.0711 201/500 [===========>..................] - ETA: 1:56 - loss: 0.7159 - regression_loss: 0.6445 - classification_loss: 0.0714 202/500 [===========>..................] - ETA: 1:55 - loss: 0.7150 - regression_loss: 0.6438 - classification_loss: 0.0713 203/500 [===========>..................] - ETA: 1:55 - loss: 0.7157 - regression_loss: 0.6444 - classification_loss: 0.0713 204/500 [===========>..................] - ETA: 1:54 - loss: 0.7157 - regression_loss: 0.6444 - classification_loss: 0.0713 205/500 [===========>..................] - ETA: 1:54 - loss: 0.7142 - regression_loss: 0.6431 - classification_loss: 0.0711 206/500 [===========>..................] - ETA: 1:54 - loss: 0.7127 - regression_loss: 0.6418 - classification_loss: 0.0709 207/500 [===========>..................] - ETA: 1:53 - loss: 0.7122 - regression_loss: 0.6413 - classification_loss: 0.0708 208/500 [===========>..................] - ETA: 1:53 - loss: 0.7118 - regression_loss: 0.6411 - classification_loss: 0.0707 209/500 [===========>..................] - ETA: 1:52 - loss: 0.7115 - regression_loss: 0.6408 - classification_loss: 0.0706 210/500 [===========>..................] - ETA: 1:52 - loss: 0.7119 - regression_loss: 0.6413 - classification_loss: 0.0706 211/500 [===========>..................] - ETA: 1:52 - loss: 0.7149 - regression_loss: 0.6439 - classification_loss: 0.0710 212/500 [===========>..................] 
- ETA: 1:51 - loss: 0.7135 - regression_loss: 0.6427 - classification_loss: 0.0708 213/500 [===========>..................] - ETA: 1:51 - loss: 0.7132 - regression_loss: 0.6423 - classification_loss: 0.0708 214/500 [===========>..................] - ETA: 1:51 - loss: 0.7130 - regression_loss: 0.6421 - classification_loss: 0.0708 215/500 [===========>..................] - ETA: 1:50 - loss: 0.7156 - regression_loss: 0.6440 - classification_loss: 0.0716 216/500 [===========>..................] - ETA: 1:50 - loss: 0.7148 - regression_loss: 0.6433 - classification_loss: 0.0715 217/500 [============>.................] - ETA: 1:49 - loss: 0.7135 - regression_loss: 0.6420 - classification_loss: 0.0714 218/500 [============>.................] - ETA: 1:49 - loss: 0.7133 - regression_loss: 0.6420 - classification_loss: 0.0714 219/500 [============>.................] - ETA: 1:49 - loss: 0.7145 - regression_loss: 0.6427 - classification_loss: 0.0718 220/500 [============>.................] - ETA: 1:48 - loss: 0.7162 - regression_loss: 0.6439 - classification_loss: 0.0723 221/500 [============>.................] - ETA: 1:48 - loss: 0.7144 - regression_loss: 0.6423 - classification_loss: 0.0721 222/500 [============>.................] - ETA: 1:47 - loss: 0.7127 - regression_loss: 0.6407 - classification_loss: 0.0720 223/500 [============>.................] - ETA: 1:47 - loss: 0.7122 - regression_loss: 0.6403 - classification_loss: 0.0719 224/500 [============>.................] - ETA: 1:47 - loss: 0.7103 - regression_loss: 0.6385 - classification_loss: 0.0717 225/500 [============>.................] - ETA: 1:46 - loss: 0.7105 - regression_loss: 0.6387 - classification_loss: 0.0718 226/500 [============>.................] - ETA: 1:46 - loss: 0.7105 - regression_loss: 0.6388 - classification_loss: 0.0716 227/500 [============>.................] - ETA: 1:46 - loss: 0.7097 - regression_loss: 0.6382 - classification_loss: 0.0715 228/500 [============>.................] 
- ETA: 1:45 - loss: 0.7093 - regression_loss: 0.6379 - classification_loss: 0.0714 229/500 [============>.................] - ETA: 1:45 - loss: 0.7074 - regression_loss: 0.6362 - classification_loss: 0.0712 230/500 [============>.................] - ETA: 1:44 - loss: 0.7088 - regression_loss: 0.6373 - classification_loss: 0.0714 231/500 [============>.................] - ETA: 1:44 - loss: 0.7093 - regression_loss: 0.6378 - classification_loss: 0.0715 232/500 [============>.................] - ETA: 1:44 - loss: 0.7077 - regression_loss: 0.6364 - classification_loss: 0.0713 233/500 [============>.................] - ETA: 1:43 - loss: 0.7079 - regression_loss: 0.6365 - classification_loss: 0.0714 234/500 [=============>................] - ETA: 1:43 - loss: 0.7091 - regression_loss: 0.6376 - classification_loss: 0.0715 235/500 [=============>................] - ETA: 1:42 - loss: 0.7087 - regression_loss: 0.6373 - classification_loss: 0.0714 236/500 [=============>................] - ETA: 1:42 - loss: 0.7089 - regression_loss: 0.6375 - classification_loss: 0.0714 237/500 [=============>................] - ETA: 1:42 - loss: 0.7100 - regression_loss: 0.6386 - classification_loss: 0.0714 238/500 [=============>................] - ETA: 1:41 - loss: 0.7092 - regression_loss: 0.6380 - classification_loss: 0.0713 239/500 [=============>................] - ETA: 1:41 - loss: 0.7086 - regression_loss: 0.6375 - classification_loss: 0.0711 240/500 [=============>................] - ETA: 1:40 - loss: 0.7085 - regression_loss: 0.6375 - classification_loss: 0.0711 241/500 [=============>................] - ETA: 1:40 - loss: 0.7090 - regression_loss: 0.6379 - classification_loss: 0.0712 242/500 [=============>................] - ETA: 1:40 - loss: 0.7077 - regression_loss: 0.6368 - classification_loss: 0.0710 243/500 [=============>................] - ETA: 1:39 - loss: 0.7074 - regression_loss: 0.6366 - classification_loss: 0.0708 244/500 [=============>................] 
- ETA: 1:39 - loss: 0.7102 - regression_loss: 0.6390 - classification_loss: 0.0713 245/500 [=============>................] - ETA: 1:39 - loss: 0.7105 - regression_loss: 0.6394 - classification_loss: 0.0711 246/500 [=============>................] - ETA: 1:38 - loss: 0.7107 - regression_loss: 0.6397 - classification_loss: 0.0711 247/500 [=============>................] - ETA: 1:38 - loss: 0.7106 - regression_loss: 0.6396 - classification_loss: 0.0710 248/500 [=============>................] - ETA: 1:37 - loss: 0.7104 - regression_loss: 0.6395 - classification_loss: 0.0709 249/500 [=============>................] - ETA: 1:37 - loss: 0.7082 - regression_loss: 0.6375 - classification_loss: 0.0707 250/500 [==============>...............] - ETA: 1:37 - loss: 0.7084 - regression_loss: 0.6375 - classification_loss: 0.0708 251/500 [==============>...............] - ETA: 1:36 - loss: 0.7070 - regression_loss: 0.6363 - classification_loss: 0.0706 252/500 [==============>...............] - ETA: 1:36 - loss: 0.7103 - regression_loss: 0.6386 - classification_loss: 0.0717 253/500 [==============>...............] - ETA: 1:35 - loss: 0.7110 - regression_loss: 0.6393 - classification_loss: 0.0717 254/500 [==============>...............] - ETA: 1:35 - loss: 0.7111 - regression_loss: 0.6394 - classification_loss: 0.0717 255/500 [==============>...............] - ETA: 1:35 - loss: 0.7098 - regression_loss: 0.6383 - classification_loss: 0.0716 256/500 [==============>...............] - ETA: 1:34 - loss: 0.7100 - regression_loss: 0.6384 - classification_loss: 0.0715 257/500 [==============>...............] - ETA: 1:34 - loss: 0.7087 - regression_loss: 0.6373 - classification_loss: 0.0714 258/500 [==============>...............] - ETA: 1:33 - loss: 0.7102 - regression_loss: 0.6386 - classification_loss: 0.0715 259/500 [==============>...............] - ETA: 1:33 - loss: 0.7097 - regression_loss: 0.6383 - classification_loss: 0.0715 260/500 [==============>...............] 
- ETA: 1:33 - loss: 0.7118 - regression_loss: 0.6396 - classification_loss: 0.0722 261/500 [==============>...............] - ETA: 1:32 - loss: 0.7111 - regression_loss: 0.6390 - classification_loss: 0.0721 262/500 [==============>...............] - ETA: 1:32 - loss: 0.7108 - regression_loss: 0.6388 - classification_loss: 0.0720 263/500 [==============>...............] - ETA: 1:32 - loss: 0.7104 - regression_loss: 0.6385 - classification_loss: 0.0719 264/500 [==============>...............] - ETA: 1:31 - loss: 0.7098 - regression_loss: 0.6380 - classification_loss: 0.0719 265/500 [==============>...............] - ETA: 1:31 - loss: 0.7093 - regression_loss: 0.6375 - classification_loss: 0.0718 266/500 [==============>...............] - ETA: 1:30 - loss: 0.7099 - regression_loss: 0.6382 - classification_loss: 0.0717 267/500 [===============>..............] - ETA: 1:30 - loss: 0.7101 - regression_loss: 0.6384 - classification_loss: 0.0716 268/500 [===============>..............] - ETA: 1:30 - loss: 0.7103 - regression_loss: 0.6386 - classification_loss: 0.0717 269/500 [===============>..............] - ETA: 1:29 - loss: 0.7096 - regression_loss: 0.6380 - classification_loss: 0.0716 270/500 [===============>..............] - ETA: 1:29 - loss: 0.7089 - regression_loss: 0.6374 - classification_loss: 0.0716 271/500 [===============>..............] - ETA: 1:28 - loss: 0.7087 - regression_loss: 0.6371 - classification_loss: 0.0716 272/500 [===============>..............] - ETA: 1:28 - loss: 0.7089 - regression_loss: 0.6373 - classification_loss: 0.0716 273/500 [===============>..............] - ETA: 1:28 - loss: 0.7099 - regression_loss: 0.6382 - classification_loss: 0.0716 274/500 [===============>..............] - ETA: 1:27 - loss: 0.7111 - regression_loss: 0.6394 - classification_loss: 0.0717 275/500 [===============>..............] - ETA: 1:27 - loss: 0.7136 - regression_loss: 0.6413 - classification_loss: 0.0723 276/500 [===============>..............] 
- ETA: 1:27 - loss: 0.7136 - regression_loss: 0.6413 - classification_loss: 0.0723
[per-batch progress frames for steps 277-499 of epoch 12 elided; total loss held near 0.71-0.72 throughout (regression_loss ~0.64, classification_loss ~0.072) as the ETA counted down from 1:26 to 0s]
499/500 [============================>.]
- ETA: 0s - loss: 0.7064 - regression_loss: 0.6345 - classification_loss: 0.0719
500/500 [==============================] - 194s 389ms/step - loss: 0.7057 - regression_loss: 0.6339 - classification_loss: 0.0718
326 instances of class plum with average precision: 0.8398
mAP: 0.8398
Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5
Epoch 13/20
[per-batch progress frames for steps 1-13 of epoch 13 elided; total loss settled from ~1.06 at step 1 to ~0.72 by step 13]
14/500 [..............................]
- ETA: 3:08 - loss: 0.7293 - regression_loss: 0.6584 - classification_loss: 0.0709
[per-batch progress frames for steps 15-110 of epoch 13 elided; total loss fluctuated between ~0.69 and ~0.78 (regression_loss ~0.62-0.70, classification_loss ~0.070-0.077) as the ETA counted down from 3:09 to 2:32]
110/500 [=====>........................]
- ETA: 2:32 - loss: 0.7065 - regression_loss: 0.6339 - classification_loss: 0.0726 111/500 [=====>........................] - ETA: 2:32 - loss: 0.7060 - regression_loss: 0.6335 - classification_loss: 0.0726 112/500 [=====>........................] - ETA: 2:31 - loss: 0.7034 - regression_loss: 0.6311 - classification_loss: 0.0723 113/500 [=====>........................] - ETA: 2:31 - loss: 0.7022 - regression_loss: 0.6301 - classification_loss: 0.0721 114/500 [=====>........................] - ETA: 2:31 - loss: 0.7028 - regression_loss: 0.6307 - classification_loss: 0.0721 115/500 [=====>........................] - ETA: 2:30 - loss: 0.7009 - regression_loss: 0.6291 - classification_loss: 0.0718 116/500 [=====>........................] - ETA: 2:30 - loss: 0.6991 - regression_loss: 0.6272 - classification_loss: 0.0719 117/500 [======>.......................] - ETA: 2:29 - loss: 0.7016 - regression_loss: 0.6294 - classification_loss: 0.0722 118/500 [======>.......................] - ETA: 2:29 - loss: 0.7008 - regression_loss: 0.6288 - classification_loss: 0.0720 119/500 [======>.......................] - ETA: 2:29 - loss: 0.7032 - regression_loss: 0.6314 - classification_loss: 0.0718 120/500 [======>.......................] - ETA: 2:28 - loss: 0.7018 - regression_loss: 0.6301 - classification_loss: 0.0717 121/500 [======>.......................] - ETA: 2:28 - loss: 0.7001 - regression_loss: 0.6287 - classification_loss: 0.0714 122/500 [======>.......................] - ETA: 2:27 - loss: 0.7013 - regression_loss: 0.6298 - classification_loss: 0.0715 123/500 [======>.......................] - ETA: 2:27 - loss: 0.7015 - regression_loss: 0.6301 - classification_loss: 0.0714 124/500 [======>.......................] - ETA: 2:27 - loss: 0.6999 - regression_loss: 0.6287 - classification_loss: 0.0712 125/500 [======>.......................] - ETA: 2:26 - loss: 0.6993 - regression_loss: 0.6282 - classification_loss: 0.0711 126/500 [======>.......................] 
- ETA: 2:26 - loss: 0.6987 - regression_loss: 0.6278 - classification_loss: 0.0709 127/500 [======>.......................] - ETA: 2:25 - loss: 0.6977 - regression_loss: 0.6269 - classification_loss: 0.0708 128/500 [======>.......................] - ETA: 2:25 - loss: 0.7004 - regression_loss: 0.6288 - classification_loss: 0.0716 129/500 [======>.......................] - ETA: 2:25 - loss: 0.6993 - regression_loss: 0.6278 - classification_loss: 0.0715 130/500 [======>.......................] - ETA: 2:24 - loss: 0.7007 - regression_loss: 0.6287 - classification_loss: 0.0720 131/500 [======>.......................] - ETA: 2:24 - loss: 0.6974 - regression_loss: 0.6258 - classification_loss: 0.0716 132/500 [======>.......................] - ETA: 2:23 - loss: 0.6968 - regression_loss: 0.6254 - classification_loss: 0.0714 133/500 [======>.......................] - ETA: 2:23 - loss: 0.6971 - regression_loss: 0.6257 - classification_loss: 0.0714 134/500 [=======>......................] - ETA: 2:23 - loss: 0.6965 - regression_loss: 0.6253 - classification_loss: 0.0713 135/500 [=======>......................] - ETA: 2:22 - loss: 0.6964 - regression_loss: 0.6251 - classification_loss: 0.0713 136/500 [=======>......................] - ETA: 2:22 - loss: 0.6951 - regression_loss: 0.6239 - classification_loss: 0.0712 137/500 [=======>......................] - ETA: 2:22 - loss: 0.6959 - regression_loss: 0.6247 - classification_loss: 0.0713 138/500 [=======>......................] - ETA: 2:21 - loss: 0.6951 - regression_loss: 0.6240 - classification_loss: 0.0711 139/500 [=======>......................] - ETA: 2:21 - loss: 0.6960 - regression_loss: 0.6249 - classification_loss: 0.0712 140/500 [=======>......................] - ETA: 2:20 - loss: 0.6942 - regression_loss: 0.6233 - classification_loss: 0.0709 141/500 [=======>......................] - ETA: 2:20 - loss: 0.6915 - regression_loss: 0.6209 - classification_loss: 0.0706 142/500 [=======>......................] 
- ETA: 2:20 - loss: 0.6895 - regression_loss: 0.6192 - classification_loss: 0.0704 143/500 [=======>......................] - ETA: 2:19 - loss: 0.6883 - regression_loss: 0.6181 - classification_loss: 0.0702 144/500 [=======>......................] - ETA: 2:19 - loss: 0.6864 - regression_loss: 0.6165 - classification_loss: 0.0699 145/500 [=======>......................] - ETA: 2:18 - loss: 0.6852 - regression_loss: 0.6155 - classification_loss: 0.0697 146/500 [=======>......................] - ETA: 2:18 - loss: 0.6841 - regression_loss: 0.6144 - classification_loss: 0.0697 147/500 [=======>......................] - ETA: 2:18 - loss: 0.6842 - regression_loss: 0.6148 - classification_loss: 0.0695 148/500 [=======>......................] - ETA: 2:17 - loss: 0.6862 - regression_loss: 0.6166 - classification_loss: 0.0696 149/500 [=======>......................] - ETA: 2:17 - loss: 0.6851 - regression_loss: 0.6157 - classification_loss: 0.0694 150/500 [========>.....................] - ETA: 2:16 - loss: 0.6856 - regression_loss: 0.6161 - classification_loss: 0.0694 151/500 [========>.....................] - ETA: 2:16 - loss: 0.6854 - regression_loss: 0.6161 - classification_loss: 0.0693 152/500 [========>.....................] - ETA: 2:16 - loss: 0.6863 - regression_loss: 0.6171 - classification_loss: 0.0692 153/500 [========>.....................] - ETA: 2:15 - loss: 0.6836 - regression_loss: 0.6145 - classification_loss: 0.0691 154/500 [========>.....................] - ETA: 2:15 - loss: 0.6820 - regression_loss: 0.6129 - classification_loss: 0.0691 155/500 [========>.....................] - ETA: 2:14 - loss: 0.6828 - regression_loss: 0.6137 - classification_loss: 0.0690 156/500 [========>.....................] - ETA: 2:14 - loss: 0.6832 - regression_loss: 0.6142 - classification_loss: 0.0691 157/500 [========>.....................] - ETA: 2:14 - loss: 0.6830 - regression_loss: 0.6139 - classification_loss: 0.0691 158/500 [========>.....................] 
- ETA: 2:13 - loss: 0.6828 - regression_loss: 0.6138 - classification_loss: 0.0690 159/500 [========>.....................] - ETA: 2:13 - loss: 0.6816 - regression_loss: 0.6127 - classification_loss: 0.0688 160/500 [========>.....................] - ETA: 2:12 - loss: 0.6803 - regression_loss: 0.6118 - classification_loss: 0.0685 161/500 [========>.....................] - ETA: 2:12 - loss: 0.6774 - regression_loss: 0.6091 - classification_loss: 0.0683 162/500 [========>.....................] - ETA: 2:12 - loss: 0.6771 - regression_loss: 0.6090 - classification_loss: 0.0681 163/500 [========>.....................] - ETA: 2:11 - loss: 0.6761 - regression_loss: 0.6082 - classification_loss: 0.0679 164/500 [========>.....................] - ETA: 2:11 - loss: 0.6786 - regression_loss: 0.6103 - classification_loss: 0.0683 165/500 [========>.....................] - ETA: 2:11 - loss: 0.6769 - regression_loss: 0.6089 - classification_loss: 0.0680 166/500 [========>.....................] - ETA: 2:10 - loss: 0.6769 - regression_loss: 0.6088 - classification_loss: 0.0682 167/500 [=========>....................] - ETA: 2:10 - loss: 0.6749 - regression_loss: 0.6070 - classification_loss: 0.0680 168/500 [=========>....................] - ETA: 2:09 - loss: 0.6754 - regression_loss: 0.6076 - classification_loss: 0.0678 169/500 [=========>....................] - ETA: 2:09 - loss: 0.6774 - regression_loss: 0.6094 - classification_loss: 0.0681 170/500 [=========>....................] - ETA: 2:09 - loss: 0.6759 - regression_loss: 0.6080 - classification_loss: 0.0679 171/500 [=========>....................] - ETA: 2:08 - loss: 0.6746 - regression_loss: 0.6068 - classification_loss: 0.0677 172/500 [=========>....................] - ETA: 2:08 - loss: 0.6767 - regression_loss: 0.6088 - classification_loss: 0.0678 173/500 [=========>....................] - ETA: 2:07 - loss: 0.6752 - regression_loss: 0.6075 - classification_loss: 0.0676 174/500 [=========>....................] 
- ETA: 2:07 - loss: 0.6748 - regression_loss: 0.6073 - classification_loss: 0.0676 175/500 [=========>....................] - ETA: 2:07 - loss: 0.6742 - regression_loss: 0.6068 - classification_loss: 0.0673 176/500 [=========>....................] - ETA: 2:06 - loss: 0.6733 - regression_loss: 0.6060 - classification_loss: 0.0673 177/500 [=========>....................] - ETA: 2:06 - loss: 0.6725 - regression_loss: 0.6053 - classification_loss: 0.0672 178/500 [=========>....................] - ETA: 2:05 - loss: 0.6717 - regression_loss: 0.6047 - classification_loss: 0.0670 179/500 [=========>....................] - ETA: 2:05 - loss: 0.6705 - regression_loss: 0.6036 - classification_loss: 0.0669 180/500 [=========>....................] - ETA: 2:05 - loss: 0.6685 - regression_loss: 0.6018 - classification_loss: 0.0667 181/500 [=========>....................] - ETA: 2:04 - loss: 0.6695 - regression_loss: 0.6026 - classification_loss: 0.0668 182/500 [=========>....................] - ETA: 2:04 - loss: 0.6688 - regression_loss: 0.6020 - classification_loss: 0.0667 183/500 [=========>....................] - ETA: 2:03 - loss: 0.6704 - regression_loss: 0.6035 - classification_loss: 0.0669 184/500 [==========>...................] - ETA: 2:03 - loss: 0.6704 - regression_loss: 0.6036 - classification_loss: 0.0669 185/500 [==========>...................] - ETA: 2:03 - loss: 0.6695 - regression_loss: 0.6027 - classification_loss: 0.0668 186/500 [==========>...................] - ETA: 2:02 - loss: 0.6680 - regression_loss: 0.6014 - classification_loss: 0.0666 187/500 [==========>...................] - ETA: 2:02 - loss: 0.6684 - regression_loss: 0.6017 - classification_loss: 0.0667 188/500 [==========>...................] - ETA: 2:01 - loss: 0.6673 - regression_loss: 0.6008 - classification_loss: 0.0665 189/500 [==========>...................] - ETA: 2:01 - loss: 0.6665 - regression_loss: 0.6000 - classification_loss: 0.0664 190/500 [==========>...................] 
- ETA: 2:01 - loss: 0.6665 - regression_loss: 0.6003 - classification_loss: 0.0662 191/500 [==========>...................] - ETA: 2:00 - loss: 0.6664 - regression_loss: 0.6002 - classification_loss: 0.0662 192/500 [==========>...................] - ETA: 2:00 - loss: 0.6671 - regression_loss: 0.6010 - classification_loss: 0.0661 193/500 [==========>...................] - ETA: 1:59 - loss: 0.6678 - regression_loss: 0.6019 - classification_loss: 0.0660 194/500 [==========>...................] - ETA: 1:59 - loss: 0.6658 - regression_loss: 0.6000 - classification_loss: 0.0657 195/500 [==========>...................] - ETA: 1:59 - loss: 0.6650 - regression_loss: 0.5993 - classification_loss: 0.0657 196/500 [==========>...................] - ETA: 1:58 - loss: 0.6662 - regression_loss: 0.6004 - classification_loss: 0.0657 197/500 [==========>...................] - ETA: 1:58 - loss: 0.6653 - regression_loss: 0.5997 - classification_loss: 0.0656 198/500 [==========>...................] - ETA: 1:57 - loss: 0.6642 - regression_loss: 0.5987 - classification_loss: 0.0655 199/500 [==========>...................] - ETA: 1:57 - loss: 0.6624 - regression_loss: 0.5970 - classification_loss: 0.0653 200/500 [===========>..................] - ETA: 1:57 - loss: 0.6611 - regression_loss: 0.5959 - classification_loss: 0.0652 201/500 [===========>..................] - ETA: 1:56 - loss: 0.6593 - regression_loss: 0.5943 - classification_loss: 0.0651 202/500 [===========>..................] - ETA: 1:56 - loss: 0.6594 - regression_loss: 0.5943 - classification_loss: 0.0652 203/500 [===========>..................] - ETA: 1:55 - loss: 0.6600 - regression_loss: 0.5948 - classification_loss: 0.0652 204/500 [===========>..................] - ETA: 1:55 - loss: 0.6580 - regression_loss: 0.5930 - classification_loss: 0.0650 205/500 [===========>..................] - ETA: 1:55 - loss: 0.6561 - regression_loss: 0.5913 - classification_loss: 0.0648 206/500 [===========>..................] 
- ETA: 1:54 - loss: 0.6548 - regression_loss: 0.5902 - classification_loss: 0.0647 207/500 [===========>..................] - ETA: 1:54 - loss: 0.6550 - regression_loss: 0.5903 - classification_loss: 0.0646 208/500 [===========>..................] - ETA: 1:53 - loss: 0.6542 - regression_loss: 0.5896 - classification_loss: 0.0646 209/500 [===========>..................] - ETA: 1:53 - loss: 0.6565 - regression_loss: 0.5913 - classification_loss: 0.0652 210/500 [===========>..................] - ETA: 1:53 - loss: 0.6576 - regression_loss: 0.5922 - classification_loss: 0.0654 211/500 [===========>..................] - ETA: 1:52 - loss: 0.6575 - regression_loss: 0.5921 - classification_loss: 0.0654 212/500 [===========>..................] - ETA: 1:52 - loss: 0.6576 - regression_loss: 0.5922 - classification_loss: 0.0654 213/500 [===========>..................] - ETA: 1:52 - loss: 0.6563 - regression_loss: 0.5909 - classification_loss: 0.0654 214/500 [===========>..................] - ETA: 1:51 - loss: 0.6548 - regression_loss: 0.5894 - classification_loss: 0.0653 215/500 [===========>..................] - ETA: 1:51 - loss: 0.6536 - regression_loss: 0.5883 - classification_loss: 0.0653 216/500 [===========>..................] - ETA: 1:50 - loss: 0.6542 - regression_loss: 0.5888 - classification_loss: 0.0654 217/500 [============>.................] - ETA: 1:50 - loss: 0.6526 - regression_loss: 0.5872 - classification_loss: 0.0653 218/500 [============>.................] - ETA: 1:50 - loss: 0.6511 - regression_loss: 0.5860 - classification_loss: 0.0651 219/500 [============>.................] - ETA: 1:49 - loss: 0.6501 - regression_loss: 0.5851 - classification_loss: 0.0650 220/500 [============>.................] - ETA: 1:49 - loss: 0.6505 - regression_loss: 0.5856 - classification_loss: 0.0649 221/500 [============>.................] - ETA: 1:48 - loss: 0.6507 - regression_loss: 0.5858 - classification_loss: 0.0649 222/500 [============>.................] 
- ETA: 1:48 - loss: 0.6521 - regression_loss: 0.5874 - classification_loss: 0.0647 223/500 [============>.................] - ETA: 1:48 - loss: 0.6515 - regression_loss: 0.5869 - classification_loss: 0.0647 224/500 [============>.................] - ETA: 1:47 - loss: 0.6511 - regression_loss: 0.5865 - classification_loss: 0.0646 225/500 [============>.................] - ETA: 1:47 - loss: 0.6530 - regression_loss: 0.5881 - classification_loss: 0.0649 226/500 [============>.................] - ETA: 1:46 - loss: 0.6520 - regression_loss: 0.5872 - classification_loss: 0.0648 227/500 [============>.................] - ETA: 1:46 - loss: 0.6515 - regression_loss: 0.5868 - classification_loss: 0.0647 228/500 [============>.................] - ETA: 1:46 - loss: 0.6519 - regression_loss: 0.5872 - classification_loss: 0.0648 229/500 [============>.................] - ETA: 1:45 - loss: 0.6519 - regression_loss: 0.5872 - classification_loss: 0.0647 230/500 [============>.................] - ETA: 1:45 - loss: 0.6528 - regression_loss: 0.5879 - classification_loss: 0.0648 231/500 [============>.................] - ETA: 1:45 - loss: 0.6532 - regression_loss: 0.5884 - classification_loss: 0.0648 232/500 [============>.................] - ETA: 1:44 - loss: 0.6522 - regression_loss: 0.5874 - classification_loss: 0.0648 233/500 [============>.................] - ETA: 1:44 - loss: 0.6523 - regression_loss: 0.5875 - classification_loss: 0.0648 234/500 [=============>................] - ETA: 1:43 - loss: 0.6508 - regression_loss: 0.5862 - classification_loss: 0.0646 235/500 [=============>................] - ETA: 1:43 - loss: 0.6495 - regression_loss: 0.5850 - classification_loss: 0.0645 236/500 [=============>................] - ETA: 1:43 - loss: 0.6482 - regression_loss: 0.5837 - classification_loss: 0.0645 237/500 [=============>................] - ETA: 1:42 - loss: 0.6484 - regression_loss: 0.5839 - classification_loss: 0.0644 238/500 [=============>................] 
- ETA: 1:42 - loss: 0.6483 - regression_loss: 0.5839 - classification_loss: 0.0644 239/500 [=============>................] - ETA: 1:41 - loss: 0.6517 - regression_loss: 0.5865 - classification_loss: 0.0651 240/500 [=============>................] - ETA: 1:41 - loss: 0.6511 - regression_loss: 0.5860 - classification_loss: 0.0651 241/500 [=============>................] - ETA: 1:41 - loss: 0.6500 - regression_loss: 0.5851 - classification_loss: 0.0649 242/500 [=============>................] - ETA: 1:40 - loss: 0.6492 - regression_loss: 0.5844 - classification_loss: 0.0648 243/500 [=============>................] - ETA: 1:40 - loss: 0.6496 - regression_loss: 0.5848 - classification_loss: 0.0648 244/500 [=============>................] - ETA: 1:39 - loss: 0.6496 - regression_loss: 0.5847 - classification_loss: 0.0649 245/500 [=============>................] - ETA: 1:39 - loss: 0.6488 - regression_loss: 0.5841 - classification_loss: 0.0648 246/500 [=============>................] - ETA: 1:39 - loss: 0.6495 - regression_loss: 0.5849 - classification_loss: 0.0647 247/500 [=============>................] - ETA: 1:38 - loss: 0.6511 - regression_loss: 0.5860 - classification_loss: 0.0651 248/500 [=============>................] - ETA: 1:38 - loss: 0.6524 - regression_loss: 0.5872 - classification_loss: 0.0652 249/500 [=============>................] - ETA: 1:38 - loss: 0.6528 - regression_loss: 0.5876 - classification_loss: 0.0652 250/500 [==============>...............] - ETA: 1:37 - loss: 0.6519 - regression_loss: 0.5868 - classification_loss: 0.0651 251/500 [==============>...............] - ETA: 1:37 - loss: 0.6513 - regression_loss: 0.5863 - classification_loss: 0.0650 252/500 [==============>...............] - ETA: 1:36 - loss: 0.6518 - regression_loss: 0.5868 - classification_loss: 0.0650 253/500 [==============>...............] - ETA: 1:36 - loss: 0.6518 - regression_loss: 0.5868 - classification_loss: 0.0650 254/500 [==============>...............] 
- ETA: 1:36 - loss: 0.6529 - regression_loss: 0.5877 - classification_loss: 0.0652 255/500 [==============>...............] - ETA: 1:35 - loss: 0.6528 - regression_loss: 0.5877 - classification_loss: 0.0651 256/500 [==============>...............] - ETA: 1:35 - loss: 0.6518 - regression_loss: 0.5868 - classification_loss: 0.0650 257/500 [==============>...............] - ETA: 1:34 - loss: 0.6513 - regression_loss: 0.5864 - classification_loss: 0.0649 258/500 [==============>...............] - ETA: 1:34 - loss: 0.6513 - regression_loss: 0.5865 - classification_loss: 0.0649 259/500 [==============>...............] - ETA: 1:34 - loss: 0.6520 - regression_loss: 0.5870 - classification_loss: 0.0650 260/500 [==============>...............] - ETA: 1:33 - loss: 0.6512 - regression_loss: 0.5862 - classification_loss: 0.0650 261/500 [==============>...............] - ETA: 1:33 - loss: 0.6498 - regression_loss: 0.5849 - classification_loss: 0.0649 262/500 [==============>...............] - ETA: 1:32 - loss: 0.6486 - regression_loss: 0.5838 - classification_loss: 0.0648 263/500 [==============>...............] - ETA: 1:32 - loss: 0.6502 - regression_loss: 0.5850 - classification_loss: 0.0653 264/500 [==============>...............] - ETA: 1:32 - loss: 0.6500 - regression_loss: 0.5848 - classification_loss: 0.0652 265/500 [==============>...............] - ETA: 1:31 - loss: 0.6500 - regression_loss: 0.5849 - classification_loss: 0.0651 266/500 [==============>...............] - ETA: 1:31 - loss: 0.6497 - regression_loss: 0.5846 - classification_loss: 0.0651 267/500 [===============>..............] - ETA: 1:30 - loss: 0.6480 - regression_loss: 0.5831 - classification_loss: 0.0649 268/500 [===============>..............] - ETA: 1:30 - loss: 0.6494 - regression_loss: 0.5840 - classification_loss: 0.0654 269/500 [===============>..............] - ETA: 1:30 - loss: 0.6523 - regression_loss: 0.5862 - classification_loss: 0.0661 270/500 [===============>..............] 
- ETA: 1:29 - loss: 0.6542 - regression_loss: 0.5878 - classification_loss: 0.0663 271/500 [===============>..............] - ETA: 1:29 - loss: 0.6549 - regression_loss: 0.5886 - classification_loss: 0.0663 272/500 [===============>..............] - ETA: 1:28 - loss: 0.6537 - regression_loss: 0.5875 - classification_loss: 0.0662 273/500 [===============>..............] - ETA: 1:28 - loss: 0.6523 - regression_loss: 0.5862 - classification_loss: 0.0661 274/500 [===============>..............] - ETA: 1:28 - loss: 0.6518 - regression_loss: 0.5858 - classification_loss: 0.0660 275/500 [===============>..............] - ETA: 1:27 - loss: 0.6515 - regression_loss: 0.5856 - classification_loss: 0.0659 276/500 [===============>..............] - ETA: 1:27 - loss: 0.6502 - regression_loss: 0.5845 - classification_loss: 0.0657 277/500 [===============>..............] - ETA: 1:27 - loss: 0.6495 - regression_loss: 0.5839 - classification_loss: 0.0656 278/500 [===============>..............] - ETA: 1:26 - loss: 0.6496 - regression_loss: 0.5840 - classification_loss: 0.0656 279/500 [===============>..............] - ETA: 1:26 - loss: 0.6491 - regression_loss: 0.5835 - classification_loss: 0.0655 280/500 [===============>..............] - ETA: 1:25 - loss: 0.6487 - regression_loss: 0.5832 - classification_loss: 0.0655 281/500 [===============>..............] - ETA: 1:25 - loss: 0.6486 - regression_loss: 0.5832 - classification_loss: 0.0654 282/500 [===============>..............] - ETA: 1:25 - loss: 0.6492 - regression_loss: 0.5836 - classification_loss: 0.0656 283/500 [===============>..............] - ETA: 1:24 - loss: 0.6500 - regression_loss: 0.5843 - classification_loss: 0.0657 284/500 [================>.............] - ETA: 1:24 - loss: 0.6500 - regression_loss: 0.5844 - classification_loss: 0.0656 285/500 [================>.............] - ETA: 1:23 - loss: 0.6510 - regression_loss: 0.5854 - classification_loss: 0.0656 286/500 [================>.............] 
- ETA: 1:23 - loss: 0.6509 - regression_loss: 0.5853 - classification_loss: 0.0656 287/500 [================>.............] - ETA: 1:23 - loss: 0.6517 - regression_loss: 0.5860 - classification_loss: 0.0657 288/500 [================>.............] - ETA: 1:22 - loss: 0.6524 - regression_loss: 0.5867 - classification_loss: 0.0656 289/500 [================>.............] - ETA: 1:22 - loss: 0.6532 - regression_loss: 0.5876 - classification_loss: 0.0657 290/500 [================>.............] - ETA: 1:21 - loss: 0.6542 - regression_loss: 0.5885 - classification_loss: 0.0657 291/500 [================>.............] - ETA: 1:21 - loss: 0.6538 - regression_loss: 0.5880 - classification_loss: 0.0657 292/500 [================>.............] - ETA: 1:21 - loss: 0.6544 - regression_loss: 0.5886 - classification_loss: 0.0658 293/500 [================>.............] - ETA: 1:20 - loss: 0.6566 - regression_loss: 0.5905 - classification_loss: 0.0661 294/500 [================>.............] - ETA: 1:20 - loss: 0.6571 - regression_loss: 0.5910 - classification_loss: 0.0660 295/500 [================>.............] - ETA: 1:20 - loss: 0.6569 - regression_loss: 0.5910 - classification_loss: 0.0659 296/500 [================>.............] - ETA: 1:19 - loss: 0.6570 - regression_loss: 0.5911 - classification_loss: 0.0659 297/500 [================>.............] - ETA: 1:19 - loss: 0.6573 - regression_loss: 0.5914 - classification_loss: 0.0658 298/500 [================>.............] - ETA: 1:18 - loss: 0.6571 - regression_loss: 0.5913 - classification_loss: 0.0658 299/500 [================>.............] - ETA: 1:18 - loss: 0.6565 - regression_loss: 0.5907 - classification_loss: 0.0658 300/500 [=================>............] - ETA: 1:18 - loss: 0.6551 - regression_loss: 0.5894 - classification_loss: 0.0657 301/500 [=================>............] - ETA: 1:17 - loss: 0.6545 - regression_loss: 0.5888 - classification_loss: 0.0657 302/500 [=================>............] 
- ETA: 1:17 - loss: 0.6547 - regression_loss: 0.5890 - classification_loss: 0.0657 303/500 [=================>............] - ETA: 1:16 - loss: 0.6551 - regression_loss: 0.5894 - classification_loss: 0.0657 304/500 [=================>............] - ETA: 1:16 - loss: 0.6555 - regression_loss: 0.5897 - classification_loss: 0.0658 305/500 [=================>............] - ETA: 1:16 - loss: 0.6547 - regression_loss: 0.5889 - classification_loss: 0.0657 306/500 [=================>............] - ETA: 1:15 - loss: 0.6556 - regression_loss: 0.5898 - classification_loss: 0.0658 307/500 [=================>............] - ETA: 1:15 - loss: 0.6540 - regression_loss: 0.5883 - classification_loss: 0.0657 308/500 [=================>............] - ETA: 1:14 - loss: 0.6542 - regression_loss: 0.5886 - classification_loss: 0.0656 309/500 [=================>............] - ETA: 1:14 - loss: 0.6537 - regression_loss: 0.5881 - classification_loss: 0.0656 310/500 [=================>............] - ETA: 1:14 - loss: 0.6533 - regression_loss: 0.5878 - classification_loss: 0.0655 311/500 [=================>............] - ETA: 1:13 - loss: 0.6533 - regression_loss: 0.5878 - classification_loss: 0.0655 312/500 [=================>............] - ETA: 1:13 - loss: 0.6525 - regression_loss: 0.5870 - classification_loss: 0.0655 313/500 [=================>............] - ETA: 1:12 - loss: 0.6530 - regression_loss: 0.5875 - classification_loss: 0.0655 314/500 [=================>............] - ETA: 1:12 - loss: 0.6531 - regression_loss: 0.5875 - classification_loss: 0.0656 315/500 [=================>............] - ETA: 1:12 - loss: 0.6524 - regression_loss: 0.5870 - classification_loss: 0.0655 316/500 [=================>............] - ETA: 1:11 - loss: 0.6527 - regression_loss: 0.5873 - classification_loss: 0.0654 317/500 [==================>...........] - ETA: 1:11 - loss: 0.6523 - regression_loss: 0.5869 - classification_loss: 0.0653 318/500 [==================>...........] 
500/500 [==============================] - 195s 390ms/step - loss: 0.6628 - regression_loss: 0.5970 - classification_loss: 0.0658
326 instances of class plum with average precision: 0.8453
mAP: 0.8453
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/20
- ETA: 2:15 - loss: 0.6297 - regression_loss: 0.5728 - classification_loss: 0.0569 154/500 [========>.....................] - ETA: 2:14 - loss: 0.6345 - regression_loss: 0.5768 - classification_loss: 0.0576 155/500 [========>.....................] - ETA: 2:14 - loss: 0.6330 - regression_loss: 0.5755 - classification_loss: 0.0575 156/500 [========>.....................] - ETA: 2:14 - loss: 0.6348 - regression_loss: 0.5772 - classification_loss: 0.0576 157/500 [========>.....................] - ETA: 2:13 - loss: 0.6354 - regression_loss: 0.5779 - classification_loss: 0.0575 158/500 [========>.....................] - ETA: 2:13 - loss: 0.6345 - regression_loss: 0.5772 - classification_loss: 0.0573 159/500 [========>.....................] - ETA: 2:12 - loss: 0.6344 - regression_loss: 0.5771 - classification_loss: 0.0573 160/500 [========>.....................] - ETA: 2:12 - loss: 0.6342 - regression_loss: 0.5770 - classification_loss: 0.0572 161/500 [========>.....................] - ETA: 2:12 - loss: 0.6328 - regression_loss: 0.5757 - classification_loss: 0.0571 162/500 [========>.....................] - ETA: 2:11 - loss: 0.6323 - regression_loss: 0.5752 - classification_loss: 0.0571 163/500 [========>.....................] - ETA: 2:11 - loss: 0.6308 - regression_loss: 0.5739 - classification_loss: 0.0569 164/500 [========>.....................] - ETA: 2:11 - loss: 0.6302 - regression_loss: 0.5733 - classification_loss: 0.0569 165/500 [========>.....................] - ETA: 2:10 - loss: 0.6334 - regression_loss: 0.5761 - classification_loss: 0.0573 166/500 [========>.....................] - ETA: 2:10 - loss: 0.6318 - regression_loss: 0.5747 - classification_loss: 0.0571 167/500 [=========>....................] - ETA: 2:09 - loss: 0.6318 - regression_loss: 0.5747 - classification_loss: 0.0570 168/500 [=========>....................] - ETA: 2:09 - loss: 0.6309 - regression_loss: 0.5740 - classification_loss: 0.0569 169/500 [=========>....................] 
- ETA: 2:08 - loss: 0.6319 - regression_loss: 0.5748 - classification_loss: 0.0571 170/500 [=========>....................] - ETA: 2:08 - loss: 0.6314 - regression_loss: 0.5745 - classification_loss: 0.0570 171/500 [=========>....................] - ETA: 2:08 - loss: 0.6293 - regression_loss: 0.5724 - classification_loss: 0.0568 172/500 [=========>....................] - ETA: 2:07 - loss: 0.6291 - regression_loss: 0.5723 - classification_loss: 0.0568 173/500 [=========>....................] - ETA: 2:07 - loss: 0.6284 - regression_loss: 0.5716 - classification_loss: 0.0567 174/500 [=========>....................] - ETA: 2:06 - loss: 0.6269 - regression_loss: 0.5702 - classification_loss: 0.0567 175/500 [=========>....................] - ETA: 2:06 - loss: 0.6277 - regression_loss: 0.5710 - classification_loss: 0.0567 176/500 [=========>....................] - ETA: 2:06 - loss: 0.6360 - regression_loss: 0.5786 - classification_loss: 0.0573 177/500 [=========>....................] - ETA: 2:05 - loss: 0.6354 - regression_loss: 0.5782 - classification_loss: 0.0572 178/500 [=========>....................] - ETA: 2:05 - loss: 0.6341 - regression_loss: 0.5771 - classification_loss: 0.0570 179/500 [=========>....................] - ETA: 2:04 - loss: 0.6355 - regression_loss: 0.5782 - classification_loss: 0.0572 180/500 [=========>....................] - ETA: 2:04 - loss: 0.6360 - regression_loss: 0.5786 - classification_loss: 0.0574 181/500 [=========>....................] - ETA: 2:04 - loss: 0.6345 - regression_loss: 0.5773 - classification_loss: 0.0572 182/500 [=========>....................] - ETA: 2:03 - loss: 0.6341 - regression_loss: 0.5771 - classification_loss: 0.0570 183/500 [=========>....................] - ETA: 2:03 - loss: 0.6338 - regression_loss: 0.5768 - classification_loss: 0.0570 184/500 [==========>...................] - ETA: 2:02 - loss: 0.6325 - regression_loss: 0.5756 - classification_loss: 0.0569 185/500 [==========>...................] 
- ETA: 2:02 - loss: 0.6314 - regression_loss: 0.5746 - classification_loss: 0.0568 186/500 [==========>...................] - ETA: 2:02 - loss: 0.6309 - regression_loss: 0.5741 - classification_loss: 0.0568 187/500 [==========>...................] - ETA: 2:01 - loss: 0.6308 - regression_loss: 0.5740 - classification_loss: 0.0568 188/500 [==========>...................] - ETA: 2:01 - loss: 0.6346 - regression_loss: 0.5766 - classification_loss: 0.0581 189/500 [==========>...................] - ETA: 2:00 - loss: 0.6341 - regression_loss: 0.5761 - classification_loss: 0.0580 190/500 [==========>...................] - ETA: 2:00 - loss: 0.6339 - regression_loss: 0.5760 - classification_loss: 0.0579 191/500 [==========>...................] - ETA: 2:00 - loss: 0.6330 - regression_loss: 0.5751 - classification_loss: 0.0579 192/500 [==========>...................] - ETA: 1:59 - loss: 0.6341 - regression_loss: 0.5762 - classification_loss: 0.0579 193/500 [==========>...................] - ETA: 1:59 - loss: 0.6356 - regression_loss: 0.5774 - classification_loss: 0.0581 194/500 [==========>...................] - ETA: 1:59 - loss: 0.6348 - regression_loss: 0.5768 - classification_loss: 0.0580 195/500 [==========>...................] - ETA: 1:58 - loss: 0.6345 - regression_loss: 0.5765 - classification_loss: 0.0580 196/500 [==========>...................] - ETA: 1:58 - loss: 0.6323 - regression_loss: 0.5745 - classification_loss: 0.0579 197/500 [==========>...................] - ETA: 1:57 - loss: 0.6324 - regression_loss: 0.5746 - classification_loss: 0.0579 198/500 [==========>...................] - ETA: 1:57 - loss: 0.6319 - regression_loss: 0.5741 - classification_loss: 0.0579 199/500 [==========>...................] - ETA: 1:57 - loss: 0.6365 - regression_loss: 0.5782 - classification_loss: 0.0583 200/500 [===========>..................] - ETA: 1:56 - loss: 0.6377 - regression_loss: 0.5794 - classification_loss: 0.0583 201/500 [===========>..................] 
- ETA: 1:56 - loss: 0.6361 - regression_loss: 0.5780 - classification_loss: 0.0581 202/500 [===========>..................] - ETA: 1:55 - loss: 0.6360 - regression_loss: 0.5780 - classification_loss: 0.0580 203/500 [===========>..................] - ETA: 1:55 - loss: 0.6356 - regression_loss: 0.5777 - classification_loss: 0.0579 204/500 [===========>..................] - ETA: 1:55 - loss: 0.6348 - regression_loss: 0.5771 - classification_loss: 0.0578 205/500 [===========>..................] - ETA: 1:54 - loss: 0.6360 - regression_loss: 0.5777 - classification_loss: 0.0583 206/500 [===========>..................] - ETA: 1:54 - loss: 0.6357 - regression_loss: 0.5774 - classification_loss: 0.0583 207/500 [===========>..................] - ETA: 1:53 - loss: 0.6355 - regression_loss: 0.5773 - classification_loss: 0.0582 208/500 [===========>..................] - ETA: 1:53 - loss: 0.6352 - regression_loss: 0.5771 - classification_loss: 0.0581 209/500 [===========>..................] - ETA: 1:53 - loss: 0.6348 - regression_loss: 0.5768 - classification_loss: 0.0581 210/500 [===========>..................] - ETA: 1:52 - loss: 0.6339 - regression_loss: 0.5759 - classification_loss: 0.0580 211/500 [===========>..................] - ETA: 1:52 - loss: 0.6336 - regression_loss: 0.5756 - classification_loss: 0.0579 212/500 [===========>..................] - ETA: 1:51 - loss: 0.6331 - regression_loss: 0.5751 - classification_loss: 0.0580 213/500 [===========>..................] - ETA: 1:51 - loss: 0.6320 - regression_loss: 0.5740 - classification_loss: 0.0580 214/500 [===========>..................] - ETA: 1:51 - loss: 0.6321 - regression_loss: 0.5741 - classification_loss: 0.0580 215/500 [===========>..................] - ETA: 1:50 - loss: 0.6324 - regression_loss: 0.5744 - classification_loss: 0.0580 216/500 [===========>..................] - ETA: 1:50 - loss: 0.6326 - regression_loss: 0.5746 - classification_loss: 0.0580 217/500 [============>.................] 
- ETA: 1:50 - loss: 0.6330 - regression_loss: 0.5749 - classification_loss: 0.0581 218/500 [============>.................] - ETA: 1:49 - loss: 0.6325 - regression_loss: 0.5744 - classification_loss: 0.0581 219/500 [============>.................] - ETA: 1:49 - loss: 0.6309 - regression_loss: 0.5729 - classification_loss: 0.0580 220/500 [============>.................] - ETA: 1:48 - loss: 0.6305 - regression_loss: 0.5726 - classification_loss: 0.0579 221/500 [============>.................] - ETA: 1:48 - loss: 0.6299 - regression_loss: 0.5721 - classification_loss: 0.0578 222/500 [============>.................] - ETA: 1:48 - loss: 0.6303 - regression_loss: 0.5724 - classification_loss: 0.0579 223/500 [============>.................] - ETA: 1:47 - loss: 0.6292 - regression_loss: 0.5715 - classification_loss: 0.0578 224/500 [============>.................] - ETA: 1:47 - loss: 0.6283 - regression_loss: 0.5706 - classification_loss: 0.0577 225/500 [============>.................] - ETA: 1:46 - loss: 0.6299 - regression_loss: 0.5716 - classification_loss: 0.0583 226/500 [============>.................] - ETA: 1:46 - loss: 0.6295 - regression_loss: 0.5712 - classification_loss: 0.0583 227/500 [============>.................] - ETA: 1:46 - loss: 0.6284 - regression_loss: 0.5702 - classification_loss: 0.0582 228/500 [============>.................] - ETA: 1:45 - loss: 0.6273 - regression_loss: 0.5692 - classification_loss: 0.0581 229/500 [============>.................] - ETA: 1:45 - loss: 0.6264 - regression_loss: 0.5684 - classification_loss: 0.0581 230/500 [============>.................] - ETA: 1:44 - loss: 0.6257 - regression_loss: 0.5677 - classification_loss: 0.0580 231/500 [============>.................] - ETA: 1:44 - loss: 0.6252 - regression_loss: 0.5673 - classification_loss: 0.0579 232/500 [============>.................] - ETA: 1:44 - loss: 0.6250 - regression_loss: 0.5671 - classification_loss: 0.0578 233/500 [============>.................] 
- ETA: 1:43 - loss: 0.6238 - regression_loss: 0.5660 - classification_loss: 0.0578 234/500 [=============>................] - ETA: 1:43 - loss: 0.6237 - regression_loss: 0.5660 - classification_loss: 0.0577 235/500 [=============>................] - ETA: 1:42 - loss: 0.6243 - regression_loss: 0.5666 - classification_loss: 0.0578 236/500 [=============>................] - ETA: 1:42 - loss: 0.6255 - regression_loss: 0.5676 - classification_loss: 0.0578 237/500 [=============>................] - ETA: 1:42 - loss: 0.6236 - regression_loss: 0.5658 - classification_loss: 0.0577 238/500 [=============>................] - ETA: 1:41 - loss: 0.6237 - regression_loss: 0.5660 - classification_loss: 0.0577 239/500 [=============>................] - ETA: 1:41 - loss: 0.6227 - regression_loss: 0.5651 - classification_loss: 0.0576 240/500 [=============>................] - ETA: 1:41 - loss: 0.6242 - regression_loss: 0.5665 - classification_loss: 0.0576 241/500 [=============>................] - ETA: 1:40 - loss: 0.6233 - regression_loss: 0.5658 - classification_loss: 0.0575 242/500 [=============>................] - ETA: 1:40 - loss: 0.6216 - regression_loss: 0.5642 - classification_loss: 0.0574 243/500 [=============>................] - ETA: 1:39 - loss: 0.6204 - regression_loss: 0.5631 - classification_loss: 0.0573 244/500 [=============>................] - ETA: 1:39 - loss: 0.6200 - regression_loss: 0.5628 - classification_loss: 0.0572 245/500 [=============>................] - ETA: 1:39 - loss: 0.6197 - regression_loss: 0.5625 - classification_loss: 0.0572 246/500 [=============>................] - ETA: 1:38 - loss: 0.6195 - regression_loss: 0.5624 - classification_loss: 0.0572 247/500 [=============>................] - ETA: 1:38 - loss: 0.6189 - regression_loss: 0.5618 - classification_loss: 0.0571 248/500 [=============>................] - ETA: 1:37 - loss: 0.6222 - regression_loss: 0.5642 - classification_loss: 0.0580 249/500 [=============>................] 
- ETA: 1:37 - loss: 0.6233 - regression_loss: 0.5651 - classification_loss: 0.0581 250/500 [==============>...............] - ETA: 1:37 - loss: 0.6215 - regression_loss: 0.5635 - classification_loss: 0.0580 251/500 [==============>...............] - ETA: 1:36 - loss: 0.6215 - regression_loss: 0.5635 - classification_loss: 0.0580 252/500 [==============>...............] - ETA: 1:36 - loss: 0.6209 - regression_loss: 0.5630 - classification_loss: 0.0580 253/500 [==============>...............] - ETA: 1:35 - loss: 0.6198 - regression_loss: 0.5619 - classification_loss: 0.0578 254/500 [==============>...............] - ETA: 1:35 - loss: 0.6195 - regression_loss: 0.5618 - classification_loss: 0.0578 255/500 [==============>...............] - ETA: 1:35 - loss: 0.6186 - regression_loss: 0.5609 - classification_loss: 0.0577 256/500 [==============>...............] - ETA: 1:34 - loss: 0.6174 - regression_loss: 0.5598 - classification_loss: 0.0576 257/500 [==============>...............] - ETA: 1:34 - loss: 0.6178 - regression_loss: 0.5603 - classification_loss: 0.0575 258/500 [==============>...............] - ETA: 1:34 - loss: 0.6165 - regression_loss: 0.5591 - classification_loss: 0.0574 259/500 [==============>...............] - ETA: 1:33 - loss: 0.6157 - regression_loss: 0.5583 - classification_loss: 0.0573 260/500 [==============>...............] - ETA: 1:33 - loss: 0.6155 - regression_loss: 0.5583 - classification_loss: 0.0572 261/500 [==============>...............] - ETA: 1:32 - loss: 0.6154 - regression_loss: 0.5582 - classification_loss: 0.0572 262/500 [==============>...............] - ETA: 1:32 - loss: 0.6150 - regression_loss: 0.5577 - classification_loss: 0.0573 263/500 [==============>...............] - ETA: 1:32 - loss: 0.6169 - regression_loss: 0.5592 - classification_loss: 0.0578 264/500 [==============>...............] - ETA: 1:31 - loss: 0.6167 - regression_loss: 0.5591 - classification_loss: 0.0576 265/500 [==============>...............] 
- ETA: 1:31 - loss: 0.6164 - regression_loss: 0.5589 - classification_loss: 0.0575 266/500 [==============>...............] - ETA: 1:30 - loss: 0.6191 - regression_loss: 0.5612 - classification_loss: 0.0578 267/500 [===============>..............] - ETA: 1:30 - loss: 0.6189 - regression_loss: 0.5611 - classification_loss: 0.0578 268/500 [===============>..............] - ETA: 1:30 - loss: 0.6192 - regression_loss: 0.5615 - classification_loss: 0.0578 269/500 [===============>..............] - ETA: 1:29 - loss: 0.6188 - regression_loss: 0.5611 - classification_loss: 0.0578 270/500 [===============>..............] - ETA: 1:29 - loss: 0.6198 - regression_loss: 0.5620 - classification_loss: 0.0578 271/500 [===============>..............] - ETA: 1:29 - loss: 0.6195 - regression_loss: 0.5617 - classification_loss: 0.0578 272/500 [===============>..............] - ETA: 1:28 - loss: 0.6197 - regression_loss: 0.5619 - classification_loss: 0.0578 273/500 [===============>..............] - ETA: 1:28 - loss: 0.6198 - regression_loss: 0.5621 - classification_loss: 0.0578 274/500 [===============>..............] - ETA: 1:27 - loss: 0.6186 - regression_loss: 0.5610 - classification_loss: 0.0577 275/500 [===============>..............] - ETA: 1:27 - loss: 0.6182 - regression_loss: 0.5606 - classification_loss: 0.0576 276/500 [===============>..............] - ETA: 1:27 - loss: 0.6201 - regression_loss: 0.5623 - classification_loss: 0.0578 277/500 [===============>..............] - ETA: 1:26 - loss: 0.6197 - regression_loss: 0.5620 - classification_loss: 0.0577 278/500 [===============>..............] - ETA: 1:26 - loss: 0.6226 - regression_loss: 0.5643 - classification_loss: 0.0583 279/500 [===============>..............] - ETA: 1:25 - loss: 0.6225 - regression_loss: 0.5642 - classification_loss: 0.0582 280/500 [===============>..............] - ETA: 1:25 - loss: 0.6215 - regression_loss: 0.5634 - classification_loss: 0.0581 281/500 [===============>..............] 
- ETA: 1:25 - loss: 0.6222 - regression_loss: 0.5639 - classification_loss: 0.0583 282/500 [===============>..............] - ETA: 1:24 - loss: 0.6213 - regression_loss: 0.5632 - classification_loss: 0.0581 283/500 [===============>..............] - ETA: 1:24 - loss: 0.6212 - regression_loss: 0.5631 - classification_loss: 0.0581 284/500 [================>.............] - ETA: 1:24 - loss: 0.6214 - regression_loss: 0.5632 - classification_loss: 0.0581 285/500 [================>.............] - ETA: 1:23 - loss: 0.6213 - regression_loss: 0.5632 - classification_loss: 0.0581 286/500 [================>.............] - ETA: 1:23 - loss: 0.6213 - regression_loss: 0.5634 - classification_loss: 0.0580 287/500 [================>.............] - ETA: 1:22 - loss: 0.6208 - regression_loss: 0.5629 - classification_loss: 0.0579 288/500 [================>.............] - ETA: 1:22 - loss: 0.6215 - regression_loss: 0.5635 - classification_loss: 0.0580 289/500 [================>.............] - ETA: 1:22 - loss: 0.6209 - regression_loss: 0.5630 - classification_loss: 0.0579 290/500 [================>.............] - ETA: 1:21 - loss: 0.6220 - regression_loss: 0.5640 - classification_loss: 0.0580 291/500 [================>.............] - ETA: 1:21 - loss: 0.6212 - regression_loss: 0.5633 - classification_loss: 0.0579 292/500 [================>.............] - ETA: 1:20 - loss: 0.6210 - regression_loss: 0.5632 - classification_loss: 0.0578 293/500 [================>.............] - ETA: 1:20 - loss: 0.6199 - regression_loss: 0.5622 - classification_loss: 0.0577 294/500 [================>.............] - ETA: 1:20 - loss: 0.6199 - regression_loss: 0.5623 - classification_loss: 0.0576 295/500 [================>.............] - ETA: 1:19 - loss: 0.6207 - regression_loss: 0.5630 - classification_loss: 0.0577 296/500 [================>.............] - ETA: 1:19 - loss: 0.6196 - regression_loss: 0.5620 - classification_loss: 0.0576 297/500 [================>.............] 
- ETA: 1:18 - loss: 0.6197 - regression_loss: 0.5620 - classification_loss: 0.0577 298/500 [================>.............] - ETA: 1:18 - loss: 0.6198 - regression_loss: 0.5621 - classification_loss: 0.0577 299/500 [================>.............] - ETA: 1:18 - loss: 0.6206 - regression_loss: 0.5630 - classification_loss: 0.0576 300/500 [=================>............] - ETA: 1:17 - loss: 0.6216 - regression_loss: 0.5638 - classification_loss: 0.0578 301/500 [=================>............] - ETA: 1:17 - loss: 0.6222 - regression_loss: 0.5644 - classification_loss: 0.0578 302/500 [=================>............] - ETA: 1:16 - loss: 0.6235 - regression_loss: 0.5657 - classification_loss: 0.0578 303/500 [=================>............] - ETA: 1:16 - loss: 0.6229 - regression_loss: 0.5650 - classification_loss: 0.0578 304/500 [=================>............] - ETA: 1:16 - loss: 0.6231 - regression_loss: 0.5652 - classification_loss: 0.0579 305/500 [=================>............] - ETA: 1:15 - loss: 0.6225 - regression_loss: 0.5646 - classification_loss: 0.0578 306/500 [=================>............] - ETA: 1:15 - loss: 0.6225 - regression_loss: 0.5647 - classification_loss: 0.0578 307/500 [=================>............] - ETA: 1:15 - loss: 0.6230 - regression_loss: 0.5652 - classification_loss: 0.0578 308/500 [=================>............] - ETA: 1:14 - loss: 0.6232 - regression_loss: 0.5654 - classification_loss: 0.0578 309/500 [=================>............] - ETA: 1:14 - loss: 0.6227 - regression_loss: 0.5650 - classification_loss: 0.0577 310/500 [=================>............] - ETA: 1:13 - loss: 0.6227 - regression_loss: 0.5649 - classification_loss: 0.0577 311/500 [=================>............] - ETA: 1:13 - loss: 0.6236 - regression_loss: 0.5657 - classification_loss: 0.0579 312/500 [=================>............] - ETA: 1:13 - loss: 0.6232 - regression_loss: 0.5653 - classification_loss: 0.0579 313/500 [=================>............] 
- ETA: 1:12 - loss: 0.6224 - regression_loss: 0.5647 - classification_loss: 0.0577 314/500 [=================>............] - ETA: 1:12 - loss: 0.6221 - regression_loss: 0.5644 - classification_loss: 0.0577 315/500 [=================>............] - ETA: 1:11 - loss: 0.6240 - regression_loss: 0.5661 - classification_loss: 0.0579 316/500 [=================>............] - ETA: 1:11 - loss: 0.6258 - regression_loss: 0.5677 - classification_loss: 0.0582 317/500 [==================>...........] - ETA: 1:11 - loss: 0.6246 - regression_loss: 0.5665 - classification_loss: 0.0581 318/500 [==================>...........] - ETA: 1:10 - loss: 0.6249 - regression_loss: 0.5669 - classification_loss: 0.0580 319/500 [==================>...........] - ETA: 1:10 - loss: 0.6249 - regression_loss: 0.5669 - classification_loss: 0.0581 320/500 [==================>...........] - ETA: 1:10 - loss: 0.6253 - regression_loss: 0.5672 - classification_loss: 0.0581 321/500 [==================>...........] - ETA: 1:09 - loss: 0.6248 - regression_loss: 0.5668 - classification_loss: 0.0580 322/500 [==================>...........] - ETA: 1:09 - loss: 0.6247 - regression_loss: 0.5668 - classification_loss: 0.0580 323/500 [==================>...........] - ETA: 1:08 - loss: 0.6246 - regression_loss: 0.5666 - classification_loss: 0.0580 324/500 [==================>...........] - ETA: 1:08 - loss: 0.6247 - regression_loss: 0.5668 - classification_loss: 0.0579 325/500 [==================>...........] - ETA: 1:08 - loss: 0.6238 - regression_loss: 0.5659 - classification_loss: 0.0579 326/500 [==================>...........] - ETA: 1:07 - loss: 0.6240 - regression_loss: 0.5662 - classification_loss: 0.0578 327/500 [==================>...........] - ETA: 1:07 - loss: 0.6233 - regression_loss: 0.5655 - classification_loss: 0.0578 328/500 [==================>...........] - ETA: 1:06 - loss: 0.6232 - regression_loss: 0.5655 - classification_loss: 0.0577 329/500 [==================>...........] 
- ETA: 1:06 - loss: 0.6232 - regression_loss: 0.5655 - classification_loss: 0.0577 330/500 [==================>...........] - ETA: 1:06 - loss: 0.6238 - regression_loss: 0.5662 - classification_loss: 0.0577 331/500 [==================>...........] - ETA: 1:05 - loss: 0.6248 - regression_loss: 0.5671 - classification_loss: 0.0577 332/500 [==================>...........] - ETA: 1:05 - loss: 0.6244 - regression_loss: 0.5667 - classification_loss: 0.0577 333/500 [==================>...........] - ETA: 1:04 - loss: 0.6271 - regression_loss: 0.5691 - classification_loss: 0.0580 334/500 [===================>..........] - ETA: 1:04 - loss: 0.6269 - regression_loss: 0.5690 - classification_loss: 0.0579 335/500 [===================>..........] - ETA: 1:04 - loss: 0.6265 - regression_loss: 0.5687 - classification_loss: 0.0578 336/500 [===================>..........] - ETA: 1:03 - loss: 0.6262 - regression_loss: 0.5684 - classification_loss: 0.0578 337/500 [===================>..........] - ETA: 1:03 - loss: 0.6268 - regression_loss: 0.5690 - classification_loss: 0.0578 338/500 [===================>..........] - ETA: 1:03 - loss: 0.6261 - regression_loss: 0.5684 - classification_loss: 0.0577 339/500 [===================>..........] - ETA: 1:02 - loss: 0.6265 - regression_loss: 0.5688 - classification_loss: 0.0577 340/500 [===================>..........] - ETA: 1:02 - loss: 0.6260 - regression_loss: 0.5684 - classification_loss: 0.0577 341/500 [===================>..........] - ETA: 1:01 - loss: 0.6272 - regression_loss: 0.5695 - classification_loss: 0.0577 342/500 [===================>..........] - ETA: 1:01 - loss: 0.6271 - regression_loss: 0.5694 - classification_loss: 0.0577 343/500 [===================>..........] - ETA: 1:01 - loss: 0.6283 - regression_loss: 0.5703 - classification_loss: 0.0580 344/500 [===================>..........] - ETA: 1:00 - loss: 0.6278 - regression_loss: 0.5699 - classification_loss: 0.0579 345/500 [===================>..........] 
- ETA: 1:00 - loss: 0.6277 - regression_loss: 0.5698 - classification_loss: 0.0579 346/500 [===================>..........] - ETA: 59s - loss: 0.6290 - regression_loss: 0.5709 - classification_loss: 0.0581  347/500 [===================>..........] - ETA: 59s - loss: 0.6289 - regression_loss: 0.5709 - classification_loss: 0.0581 348/500 [===================>..........] - ETA: 59s - loss: 0.6282 - regression_loss: 0.5702 - classification_loss: 0.0580 349/500 [===================>..........] - ETA: 58s - loss: 0.6274 - regression_loss: 0.5694 - classification_loss: 0.0580 350/500 [====================>.........] - ETA: 58s - loss: 0.6273 - regression_loss: 0.5693 - classification_loss: 0.0579 351/500 [====================>.........] - ETA: 57s - loss: 0.6267 - regression_loss: 0.5689 - classification_loss: 0.0579 352/500 [====================>.........] - ETA: 57s - loss: 0.6269 - regression_loss: 0.5690 - classification_loss: 0.0579 353/500 [====================>.........] - ETA: 57s - loss: 0.6266 - regression_loss: 0.5686 - classification_loss: 0.0580 354/500 [====================>.........] - ETA: 56s - loss: 0.6258 - regression_loss: 0.5679 - classification_loss: 0.0579 355/500 [====================>.........] - ETA: 56s - loss: 0.6254 - regression_loss: 0.5675 - classification_loss: 0.0578 356/500 [====================>.........] - ETA: 56s - loss: 0.6249 - regression_loss: 0.5671 - classification_loss: 0.0578 357/500 [====================>.........] - ETA: 55s - loss: 0.6253 - regression_loss: 0.5675 - classification_loss: 0.0578 358/500 [====================>.........] - ETA: 55s - loss: 0.6265 - regression_loss: 0.5686 - classification_loss: 0.0579 359/500 [====================>.........] - ETA: 54s - loss: 0.6255 - regression_loss: 0.5677 - classification_loss: 0.0578 360/500 [====================>.........] - ETA: 54s - loss: 0.6248 - regression_loss: 0.5670 - classification_loss: 0.0578 361/500 [====================>.........] 
[... 362/500 – 489/500 intermediate progress updates elided ...]
[... 490/500 – 499/500 intermediate progress updates elided ...]
500/500 [==============================] - 195s 389ms/step - loss: 0.6391 - regression_loss: 0.5806 - classification_loss: 0.0585
326 instances of class plum with average precision: 0.8204
mAP: 0.8204
Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5
Epoch 15/20
[... 1/500 – 4/500 intermediate progress updates elided ...]
[... 5/500 – 196/500 intermediate progress updates elided ...]
- ETA: 1:58 - loss: 0.5660 - regression_loss: 0.5120 - classification_loss: 0.0540 197/500 [==========>...................] - ETA: 1:58 - loss: 0.5653 - regression_loss: 0.5114 - classification_loss: 0.0539 198/500 [==========>...................] - ETA: 1:57 - loss: 0.5639 - regression_loss: 0.5101 - classification_loss: 0.0537 199/500 [==========>...................] - ETA: 1:57 - loss: 0.5669 - regression_loss: 0.5126 - classification_loss: 0.0543 200/500 [===========>..................] - ETA: 1:56 - loss: 0.5666 - regression_loss: 0.5124 - classification_loss: 0.0542 201/500 [===========>..................] - ETA: 1:56 - loss: 0.5650 - regression_loss: 0.5110 - classification_loss: 0.0540 202/500 [===========>..................] - ETA: 1:56 - loss: 0.5641 - regression_loss: 0.5101 - classification_loss: 0.0540 203/500 [===========>..................] - ETA: 1:55 - loss: 0.5648 - regression_loss: 0.5109 - classification_loss: 0.0539 204/500 [===========>..................] - ETA: 1:55 - loss: 0.5639 - regression_loss: 0.5101 - classification_loss: 0.0538 205/500 [===========>..................] - ETA: 1:55 - loss: 0.5625 - regression_loss: 0.5089 - classification_loss: 0.0536 206/500 [===========>..................] - ETA: 1:54 - loss: 0.5626 - regression_loss: 0.5092 - classification_loss: 0.0535 207/500 [===========>..................] - ETA: 1:54 - loss: 0.5627 - regression_loss: 0.5093 - classification_loss: 0.0534 208/500 [===========>..................] - ETA: 1:53 - loss: 0.5627 - regression_loss: 0.5094 - classification_loss: 0.0533 209/500 [===========>..................] - ETA: 1:53 - loss: 0.5622 - regression_loss: 0.5089 - classification_loss: 0.0533 210/500 [===========>..................] - ETA: 1:53 - loss: 0.5622 - regression_loss: 0.5090 - classification_loss: 0.0532 211/500 [===========>..................] - ETA: 1:52 - loss: 0.5617 - regression_loss: 0.5086 - classification_loss: 0.0531 212/500 [===========>..................] 
- ETA: 1:52 - loss: 0.5614 - regression_loss: 0.5083 - classification_loss: 0.0531 213/500 [===========>..................] - ETA: 1:51 - loss: 0.5612 - regression_loss: 0.5081 - classification_loss: 0.0530 214/500 [===========>..................] - ETA: 1:51 - loss: 0.5606 - regression_loss: 0.5077 - classification_loss: 0.0529 215/500 [===========>..................] - ETA: 1:51 - loss: 0.5596 - regression_loss: 0.5068 - classification_loss: 0.0528 216/500 [===========>..................] - ETA: 1:50 - loss: 0.5593 - regression_loss: 0.5066 - classification_loss: 0.0528 217/500 [============>.................] - ETA: 1:50 - loss: 0.5588 - regression_loss: 0.5061 - classification_loss: 0.0526 218/500 [============>.................] - ETA: 1:49 - loss: 0.5588 - regression_loss: 0.5062 - classification_loss: 0.0526 219/500 [============>.................] - ETA: 1:49 - loss: 0.5596 - regression_loss: 0.5070 - classification_loss: 0.0526 220/500 [============>.................] - ETA: 1:49 - loss: 0.5596 - regression_loss: 0.5070 - classification_loss: 0.0525 221/500 [============>.................] - ETA: 1:48 - loss: 0.5594 - regression_loss: 0.5068 - classification_loss: 0.0526 222/500 [============>.................] - ETA: 1:48 - loss: 0.5602 - regression_loss: 0.5074 - classification_loss: 0.0527 223/500 [============>.................] - ETA: 1:48 - loss: 0.5605 - regression_loss: 0.5077 - classification_loss: 0.0528 224/500 [============>.................] - ETA: 1:47 - loss: 0.5609 - regression_loss: 0.5081 - classification_loss: 0.0528 225/500 [============>.................] - ETA: 1:47 - loss: 0.5613 - regression_loss: 0.5085 - classification_loss: 0.0529 226/500 [============>.................] - ETA: 1:46 - loss: 0.5612 - regression_loss: 0.5082 - classification_loss: 0.0529 227/500 [============>.................] - ETA: 1:46 - loss: 0.5621 - regression_loss: 0.5092 - classification_loss: 0.0529 228/500 [============>.................] 
- ETA: 1:46 - loss: 0.5642 - regression_loss: 0.5113 - classification_loss: 0.0530 229/500 [============>.................] - ETA: 1:45 - loss: 0.5643 - regression_loss: 0.5114 - classification_loss: 0.0528 230/500 [============>.................] - ETA: 1:45 - loss: 0.5640 - regression_loss: 0.5111 - classification_loss: 0.0529 231/500 [============>.................] - ETA: 1:44 - loss: 0.5629 - regression_loss: 0.5102 - classification_loss: 0.0527 232/500 [============>.................] - ETA: 1:44 - loss: 0.5635 - regression_loss: 0.5108 - classification_loss: 0.0528 233/500 [============>.................] - ETA: 1:44 - loss: 0.5633 - regression_loss: 0.5105 - classification_loss: 0.0528 234/500 [=============>................] - ETA: 1:43 - loss: 0.5632 - regression_loss: 0.5105 - classification_loss: 0.0527 235/500 [=============>................] - ETA: 1:43 - loss: 0.5631 - regression_loss: 0.5104 - classification_loss: 0.0527 236/500 [=============>................] - ETA: 1:43 - loss: 0.5628 - regression_loss: 0.5102 - classification_loss: 0.0526 237/500 [=============>................] - ETA: 1:42 - loss: 0.5619 - regression_loss: 0.5093 - classification_loss: 0.0526 238/500 [=============>................] - ETA: 1:42 - loss: 0.5613 - regression_loss: 0.5088 - classification_loss: 0.0525 239/500 [=============>................] - ETA: 1:41 - loss: 0.5606 - regression_loss: 0.5082 - classification_loss: 0.0524 240/500 [=============>................] - ETA: 1:41 - loss: 0.5603 - regression_loss: 0.5081 - classification_loss: 0.0523 241/500 [=============>................] - ETA: 1:41 - loss: 0.5591 - regression_loss: 0.5068 - classification_loss: 0.0522 242/500 [=============>................] - ETA: 1:40 - loss: 0.5587 - regression_loss: 0.5066 - classification_loss: 0.0521 243/500 [=============>................] - ETA: 1:40 - loss: 0.5584 - regression_loss: 0.5063 - classification_loss: 0.0520 244/500 [=============>................] 
- ETA: 1:39 - loss: 0.5576 - regression_loss: 0.5057 - classification_loss: 0.0520 245/500 [=============>................] - ETA: 1:39 - loss: 0.5580 - regression_loss: 0.5061 - classification_loss: 0.0519 246/500 [=============>................] - ETA: 1:39 - loss: 0.5598 - regression_loss: 0.5076 - classification_loss: 0.0522 247/500 [=============>................] - ETA: 1:38 - loss: 0.5606 - regression_loss: 0.5084 - classification_loss: 0.0522 248/500 [=============>................] - ETA: 1:38 - loss: 0.5602 - regression_loss: 0.5081 - classification_loss: 0.0521 249/500 [=============>................] - ETA: 1:37 - loss: 0.5590 - regression_loss: 0.5070 - classification_loss: 0.0520 250/500 [==============>...............] - ETA: 1:37 - loss: 0.5605 - regression_loss: 0.5082 - classification_loss: 0.0522 251/500 [==============>...............] - ETA: 1:37 - loss: 0.5611 - regression_loss: 0.5089 - classification_loss: 0.0522 252/500 [==============>...............] - ETA: 1:36 - loss: 0.5604 - regression_loss: 0.5083 - classification_loss: 0.0521 253/500 [==============>...............] - ETA: 1:36 - loss: 0.5607 - regression_loss: 0.5086 - classification_loss: 0.0521 254/500 [==============>...............] - ETA: 1:36 - loss: 0.5626 - regression_loss: 0.5102 - classification_loss: 0.0523 255/500 [==============>...............] - ETA: 1:35 - loss: 0.5630 - regression_loss: 0.5107 - classification_loss: 0.0524 256/500 [==============>...............] - ETA: 1:35 - loss: 0.5636 - regression_loss: 0.5112 - classification_loss: 0.0524 257/500 [==============>...............] - ETA: 1:34 - loss: 0.5629 - regression_loss: 0.5105 - classification_loss: 0.0524 258/500 [==============>...............] - ETA: 1:34 - loss: 0.5628 - regression_loss: 0.5104 - classification_loss: 0.0524 259/500 [==============>...............] - ETA: 1:34 - loss: 0.5618 - regression_loss: 0.5095 - classification_loss: 0.0524 260/500 [==============>...............] 
- ETA: 1:33 - loss: 0.5626 - regression_loss: 0.5102 - classification_loss: 0.0524 261/500 [==============>...............] - ETA: 1:33 - loss: 0.5651 - regression_loss: 0.5116 - classification_loss: 0.0536 262/500 [==============>...............] - ETA: 1:32 - loss: 0.5652 - regression_loss: 0.5116 - classification_loss: 0.0535 263/500 [==============>...............] - ETA: 1:32 - loss: 0.5651 - regression_loss: 0.5116 - classification_loss: 0.0535 264/500 [==============>...............] - ETA: 1:32 - loss: 0.5649 - regression_loss: 0.5113 - classification_loss: 0.0536 265/500 [==============>...............] - ETA: 1:31 - loss: 0.5645 - regression_loss: 0.5109 - classification_loss: 0.0536 266/500 [==============>...............] - ETA: 1:31 - loss: 0.5655 - regression_loss: 0.5120 - classification_loss: 0.0536 267/500 [===============>..............] - ETA: 1:30 - loss: 0.5644 - regression_loss: 0.5109 - classification_loss: 0.0535 268/500 [===============>..............] - ETA: 1:30 - loss: 0.5633 - regression_loss: 0.5099 - classification_loss: 0.0534 269/500 [===============>..............] - ETA: 1:30 - loss: 0.5619 - regression_loss: 0.5087 - classification_loss: 0.0533 270/500 [===============>..............] - ETA: 1:29 - loss: 0.5612 - regression_loss: 0.5080 - classification_loss: 0.0532 271/500 [===============>..............] - ETA: 1:29 - loss: 0.5635 - regression_loss: 0.5096 - classification_loss: 0.0540 272/500 [===============>..............] - ETA: 1:29 - loss: 0.5638 - regression_loss: 0.5096 - classification_loss: 0.0543 273/500 [===============>..............] - ETA: 1:28 - loss: 0.5628 - regression_loss: 0.5086 - classification_loss: 0.0542 274/500 [===============>..............] - ETA: 1:28 - loss: 0.5623 - regression_loss: 0.5080 - classification_loss: 0.0543 275/500 [===============>..............] - ETA: 1:27 - loss: 0.5634 - regression_loss: 0.5088 - classification_loss: 0.0546 276/500 [===============>..............] 
- ETA: 1:27 - loss: 0.5629 - regression_loss: 0.5084 - classification_loss: 0.0545 277/500 [===============>..............] - ETA: 1:27 - loss: 0.5634 - regression_loss: 0.5086 - classification_loss: 0.0548 278/500 [===============>..............] - ETA: 1:26 - loss: 0.5639 - regression_loss: 0.5091 - classification_loss: 0.0548 279/500 [===============>..............] - ETA: 1:26 - loss: 0.5633 - regression_loss: 0.5085 - classification_loss: 0.0548 280/500 [===============>..............] - ETA: 1:25 - loss: 0.5630 - regression_loss: 0.5083 - classification_loss: 0.0547 281/500 [===============>..............] - ETA: 1:25 - loss: 0.5633 - regression_loss: 0.5086 - classification_loss: 0.0547 282/500 [===============>..............] - ETA: 1:25 - loss: 0.5628 - regression_loss: 0.5082 - classification_loss: 0.0546 283/500 [===============>..............] - ETA: 1:24 - loss: 0.5647 - regression_loss: 0.5099 - classification_loss: 0.0548 284/500 [================>.............] - ETA: 1:24 - loss: 0.5646 - regression_loss: 0.5098 - classification_loss: 0.0548 285/500 [================>.............] - ETA: 1:23 - loss: 0.5652 - regression_loss: 0.5103 - classification_loss: 0.0549 286/500 [================>.............] - ETA: 1:23 - loss: 0.5649 - regression_loss: 0.5099 - classification_loss: 0.0550 287/500 [================>.............] - ETA: 1:23 - loss: 0.5650 - regression_loss: 0.5100 - classification_loss: 0.0550 288/500 [================>.............] - ETA: 1:22 - loss: 0.5665 - regression_loss: 0.5113 - classification_loss: 0.0552 289/500 [================>.............] - ETA: 1:22 - loss: 0.5668 - regression_loss: 0.5116 - classification_loss: 0.0552 290/500 [================>.............] - ETA: 1:21 - loss: 0.5678 - regression_loss: 0.5125 - classification_loss: 0.0552 291/500 [================>.............] - ETA: 1:21 - loss: 0.5674 - regression_loss: 0.5123 - classification_loss: 0.0552 292/500 [================>.............] 
- ETA: 1:21 - loss: 0.5682 - regression_loss: 0.5130 - classification_loss: 0.0553 293/500 [================>.............] - ETA: 1:20 - loss: 0.5680 - regression_loss: 0.5128 - classification_loss: 0.0552 294/500 [================>.............] - ETA: 1:20 - loss: 0.5681 - regression_loss: 0.5129 - classification_loss: 0.0551 295/500 [================>.............] - ETA: 1:20 - loss: 0.5680 - regression_loss: 0.5128 - classification_loss: 0.0551 296/500 [================>.............] - ETA: 1:19 - loss: 0.5683 - regression_loss: 0.5132 - classification_loss: 0.0551 297/500 [================>.............] - ETA: 1:19 - loss: 0.5683 - regression_loss: 0.5133 - classification_loss: 0.0550 298/500 [================>.............] - ETA: 1:18 - loss: 0.5679 - regression_loss: 0.5129 - classification_loss: 0.0550 299/500 [================>.............] - ETA: 1:18 - loss: 0.5676 - regression_loss: 0.5127 - classification_loss: 0.0549 300/500 [=================>............] - ETA: 1:18 - loss: 0.5687 - regression_loss: 0.5138 - classification_loss: 0.0549 301/500 [=================>............] - ETA: 1:17 - loss: 0.5683 - regression_loss: 0.5135 - classification_loss: 0.0548 302/500 [=================>............] - ETA: 1:17 - loss: 0.5697 - regression_loss: 0.5149 - classification_loss: 0.0548 303/500 [=================>............] - ETA: 1:16 - loss: 0.5697 - regression_loss: 0.5149 - classification_loss: 0.0547 304/500 [=================>............] - ETA: 1:16 - loss: 0.5690 - regression_loss: 0.5143 - classification_loss: 0.0547 305/500 [=================>............] - ETA: 1:16 - loss: 0.5698 - regression_loss: 0.5152 - classification_loss: 0.0546 306/500 [=================>............] - ETA: 1:15 - loss: 0.5691 - regression_loss: 0.5145 - classification_loss: 0.0545 307/500 [=================>............] - ETA: 1:15 - loss: 0.5692 - regression_loss: 0.5145 - classification_loss: 0.0547 308/500 [=================>............] 
- ETA: 1:14 - loss: 0.5693 - regression_loss: 0.5146 - classification_loss: 0.0547 309/500 [=================>............] - ETA: 1:14 - loss: 0.5693 - regression_loss: 0.5146 - classification_loss: 0.0547 310/500 [=================>............] - ETA: 1:14 - loss: 0.5699 - regression_loss: 0.5151 - classification_loss: 0.0548 311/500 [=================>............] - ETA: 1:13 - loss: 0.5699 - regression_loss: 0.5152 - classification_loss: 0.0548 312/500 [=================>............] - ETA: 1:13 - loss: 0.5698 - regression_loss: 0.5151 - classification_loss: 0.0547 313/500 [=================>............] - ETA: 1:12 - loss: 0.5696 - regression_loss: 0.5149 - classification_loss: 0.0547 314/500 [=================>............] - ETA: 1:12 - loss: 0.5694 - regression_loss: 0.5148 - classification_loss: 0.0546 315/500 [=================>............] - ETA: 1:12 - loss: 0.5690 - regression_loss: 0.5145 - classification_loss: 0.0545 316/500 [=================>............] - ETA: 1:11 - loss: 0.5686 - regression_loss: 0.5142 - classification_loss: 0.0544 317/500 [==================>...........] - ETA: 1:11 - loss: 0.5678 - regression_loss: 0.5134 - classification_loss: 0.0543 318/500 [==================>...........] - ETA: 1:11 - loss: 0.5684 - regression_loss: 0.5139 - classification_loss: 0.0545 319/500 [==================>...........] - ETA: 1:10 - loss: 0.5684 - regression_loss: 0.5138 - classification_loss: 0.0545 320/500 [==================>...........] - ETA: 1:10 - loss: 0.5695 - regression_loss: 0.5147 - classification_loss: 0.0548 321/500 [==================>...........] - ETA: 1:09 - loss: 0.5694 - regression_loss: 0.5146 - classification_loss: 0.0547 322/500 [==================>...........] - ETA: 1:09 - loss: 0.5704 - regression_loss: 0.5156 - classification_loss: 0.0548 323/500 [==================>...........] - ETA: 1:09 - loss: 0.5703 - regression_loss: 0.5155 - classification_loss: 0.0548 324/500 [==================>...........] 
- ETA: 1:08 - loss: 0.5690 - regression_loss: 0.5143 - classification_loss: 0.0547 325/500 [==================>...........] - ETA: 1:08 - loss: 0.5681 - regression_loss: 0.5135 - classification_loss: 0.0546 326/500 [==================>...........] - ETA: 1:07 - loss: 0.5679 - regression_loss: 0.5134 - classification_loss: 0.0546 327/500 [==================>...........] - ETA: 1:07 - loss: 0.5680 - regression_loss: 0.5135 - classification_loss: 0.0545 328/500 [==================>...........] - ETA: 1:07 - loss: 0.5683 - regression_loss: 0.5139 - classification_loss: 0.0544 329/500 [==================>...........] - ETA: 1:06 - loss: 0.5694 - regression_loss: 0.5149 - classification_loss: 0.0544 330/500 [==================>...........] - ETA: 1:06 - loss: 0.5697 - regression_loss: 0.5153 - classification_loss: 0.0544 331/500 [==================>...........] - ETA: 1:06 - loss: 0.5706 - regression_loss: 0.5161 - classification_loss: 0.0545 332/500 [==================>...........] - ETA: 1:05 - loss: 0.5703 - regression_loss: 0.5159 - classification_loss: 0.0544 333/500 [==================>...........] - ETA: 1:05 - loss: 0.5720 - regression_loss: 0.5173 - classification_loss: 0.0547 334/500 [===================>..........] - ETA: 1:04 - loss: 0.5721 - regression_loss: 0.5174 - classification_loss: 0.0547 335/500 [===================>..........] - ETA: 1:04 - loss: 0.5716 - regression_loss: 0.5170 - classification_loss: 0.0546 336/500 [===================>..........] - ETA: 1:04 - loss: 0.5716 - regression_loss: 0.5171 - classification_loss: 0.0546 337/500 [===================>..........] - ETA: 1:03 - loss: 0.5719 - regression_loss: 0.5173 - classification_loss: 0.0545 338/500 [===================>..........] - ETA: 1:03 - loss: 0.5727 - regression_loss: 0.5182 - classification_loss: 0.0545 339/500 [===================>..........] - ETA: 1:02 - loss: 0.5728 - regression_loss: 0.5183 - classification_loss: 0.0545 340/500 [===================>..........] 
- ETA: 1:02 - loss: 0.5722 - regression_loss: 0.5177 - classification_loss: 0.0544 341/500 [===================>..........] - ETA: 1:02 - loss: 0.5719 - regression_loss: 0.5175 - classification_loss: 0.0544 342/500 [===================>..........] - ETA: 1:01 - loss: 0.5714 - regression_loss: 0.5170 - classification_loss: 0.0543 343/500 [===================>..........] - ETA: 1:01 - loss: 0.5721 - regression_loss: 0.5176 - classification_loss: 0.0545 344/500 [===================>..........] - ETA: 1:00 - loss: 0.5740 - regression_loss: 0.5192 - classification_loss: 0.0547 345/500 [===================>..........] - ETA: 1:00 - loss: 0.5739 - regression_loss: 0.5192 - classification_loss: 0.0548 346/500 [===================>..........] - ETA: 1:00 - loss: 0.5732 - regression_loss: 0.5185 - classification_loss: 0.0547 347/500 [===================>..........] - ETA: 59s - loss: 0.5727 - regression_loss: 0.5182 - classification_loss: 0.0545  348/500 [===================>..........] - ETA: 59s - loss: 0.5732 - regression_loss: 0.5186 - classification_loss: 0.0546 349/500 [===================>..........] - ETA: 59s - loss: 0.5735 - regression_loss: 0.5189 - classification_loss: 0.0545 350/500 [====================>.........] - ETA: 58s - loss: 0.5752 - regression_loss: 0.5206 - classification_loss: 0.0545 351/500 [====================>.........] - ETA: 58s - loss: 0.5765 - regression_loss: 0.5220 - classification_loss: 0.0545 352/500 [====================>.........] - ETA: 57s - loss: 0.5763 - regression_loss: 0.5219 - classification_loss: 0.0544 353/500 [====================>.........] - ETA: 57s - loss: 0.5765 - regression_loss: 0.5221 - classification_loss: 0.0544 354/500 [====================>.........] - ETA: 57s - loss: 0.5776 - regression_loss: 0.5230 - classification_loss: 0.0547 355/500 [====================>.........] - ETA: 56s - loss: 0.5768 - regression_loss: 0.5223 - classification_loss: 0.0546 356/500 [====================>.........] 
- ETA: 56s - loss: 0.5770 - regression_loss: 0.5224 - classification_loss: 0.0546 357/500 [====================>.........] - ETA: 55s - loss: 0.5783 - regression_loss: 0.5237 - classification_loss: 0.0546 358/500 [====================>.........] - ETA: 55s - loss: 0.5776 - regression_loss: 0.5231 - classification_loss: 0.0545 359/500 [====================>.........] - ETA: 55s - loss: 0.5786 - regression_loss: 0.5241 - classification_loss: 0.0544 360/500 [====================>.........] - ETA: 54s - loss: 0.5800 - regression_loss: 0.5256 - classification_loss: 0.0544 361/500 [====================>.........] - ETA: 54s - loss: 0.5808 - regression_loss: 0.5265 - classification_loss: 0.0544 362/500 [====================>.........] - ETA: 53s - loss: 0.5800 - regression_loss: 0.5256 - classification_loss: 0.0543 363/500 [====================>.........] - ETA: 53s - loss: 0.5806 - regression_loss: 0.5263 - classification_loss: 0.0543 364/500 [====================>.........] - ETA: 53s - loss: 0.5813 - regression_loss: 0.5269 - classification_loss: 0.0544 365/500 [====================>.........] - ETA: 52s - loss: 0.5833 - regression_loss: 0.5286 - classification_loss: 0.0547 366/500 [====================>.........] - ETA: 52s - loss: 0.5834 - regression_loss: 0.5288 - classification_loss: 0.0547 367/500 [=====================>........] - ETA: 52s - loss: 0.5837 - regression_loss: 0.5290 - classification_loss: 0.0547 368/500 [=====================>........] - ETA: 51s - loss: 0.5840 - regression_loss: 0.5293 - classification_loss: 0.0547 369/500 [=====================>........] - ETA: 51s - loss: 0.5841 - regression_loss: 0.5295 - classification_loss: 0.0546 370/500 [=====================>........] - ETA: 50s - loss: 0.5842 - regression_loss: 0.5296 - classification_loss: 0.0546 371/500 [=====================>........] - ETA: 50s - loss: 0.5853 - regression_loss: 0.5305 - classification_loss: 0.0548 372/500 [=====================>........] 
- ETA: 50s - loss: 0.5848 - regression_loss: 0.5300 - classification_loss: 0.0548 373/500 [=====================>........] - ETA: 49s - loss: 0.5849 - regression_loss: 0.5301 - classification_loss: 0.0548 374/500 [=====================>........] - ETA: 49s - loss: 0.5849 - regression_loss: 0.5301 - classification_loss: 0.0548 375/500 [=====================>........] - ETA: 48s - loss: 0.5852 - regression_loss: 0.5305 - classification_loss: 0.0547 376/500 [=====================>........] - ETA: 48s - loss: 0.5849 - regression_loss: 0.5302 - classification_loss: 0.0547 377/500 [=====================>........] - ETA: 48s - loss: 0.5841 - regression_loss: 0.5295 - classification_loss: 0.0546 378/500 [=====================>........] - ETA: 47s - loss: 0.5838 - regression_loss: 0.5292 - classification_loss: 0.0546 379/500 [=====================>........] - ETA: 47s - loss: 0.5851 - regression_loss: 0.5303 - classification_loss: 0.0548 380/500 [=====================>........] - ETA: 46s - loss: 0.5853 - regression_loss: 0.5305 - classification_loss: 0.0548 381/500 [=====================>........] - ETA: 46s - loss: 0.5849 - regression_loss: 0.5302 - classification_loss: 0.0547 382/500 [=====================>........] - ETA: 46s - loss: 0.5844 - regression_loss: 0.5297 - classification_loss: 0.0547 383/500 [=====================>........] - ETA: 45s - loss: 0.5835 - regression_loss: 0.5290 - classification_loss: 0.0546 384/500 [======================>.......] - ETA: 45s - loss: 0.5829 - regression_loss: 0.5284 - classification_loss: 0.0545 385/500 [======================>.......] - ETA: 44s - loss: 0.5833 - regression_loss: 0.5288 - classification_loss: 0.0545 386/500 [======================>.......] - ETA: 44s - loss: 0.5837 - regression_loss: 0.5293 - classification_loss: 0.0544 387/500 [======================>.......] - ETA: 44s - loss: 0.5834 - regression_loss: 0.5290 - classification_loss: 0.0544 388/500 [======================>.......] 
- ETA: 43s - loss: 0.5829 - regression_loss: 0.5286 - classification_loss: 0.0543 389/500 [======================>.......] - ETA: 43s - loss: 0.5827 - regression_loss: 0.5284 - classification_loss: 0.0543 390/500 [======================>.......] - ETA: 43s - loss: 0.5826 - regression_loss: 0.5284 - classification_loss: 0.0543 391/500 [======================>.......] - ETA: 42s - loss: 0.5819 - regression_loss: 0.5276 - classification_loss: 0.0543 392/500 [======================>.......] - ETA: 42s - loss: 0.5815 - regression_loss: 0.5273 - classification_loss: 0.0542 393/500 [======================>.......] - ETA: 41s - loss: 0.5812 - regression_loss: 0.5270 - classification_loss: 0.0541 394/500 [======================>.......] - ETA: 41s - loss: 0.5811 - regression_loss: 0.5270 - classification_loss: 0.0541 395/500 [======================>.......] - ETA: 41s - loss: 0.5814 - regression_loss: 0.5273 - classification_loss: 0.0541 396/500 [======================>.......] - ETA: 40s - loss: 0.5811 - regression_loss: 0.5270 - classification_loss: 0.0541 397/500 [======================>.......] - ETA: 40s - loss: 0.5802 - regression_loss: 0.5262 - classification_loss: 0.0540 398/500 [======================>.......] - ETA: 39s - loss: 0.5809 - regression_loss: 0.5269 - classification_loss: 0.0540 399/500 [======================>.......] - ETA: 39s - loss: 0.5808 - regression_loss: 0.5269 - classification_loss: 0.0540 400/500 [=======================>......] - ETA: 39s - loss: 0.5805 - regression_loss: 0.5266 - classification_loss: 0.0539 401/500 [=======================>......] - ETA: 38s - loss: 0.5803 - regression_loss: 0.5264 - classification_loss: 0.0539 402/500 [=======================>......] - ETA: 38s - loss: 0.5808 - regression_loss: 0.5269 - classification_loss: 0.0539 403/500 [=======================>......] - ETA: 37s - loss: 0.5800 - regression_loss: 0.5261 - classification_loss: 0.0538 404/500 [=======================>......] 
- ETA: 37s - loss: 0.5794 - regression_loss: 0.5257 - classification_loss: 0.0538 [... per-step progress output truncated ...]
500/500 [==============================] - 195s 390ms/step - loss: 0.5763 - regression_loss: 0.5230 - classification_loss: 0.0532
326 instances of class plum with average precision: 0.8157
mAP: 0.8157
Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5
Epoch 16/20
1/500 [..............................] - ETA: 3:15 - loss: 0.3226 - regression_loss: 0.2981 - classification_loss: 0.0245 [... per-step progress output truncated ...] 238/500 [=============>................]
- ETA: 1:41 - loss: 0.5573 - regression_loss: 0.5067 - classification_loss: 0.0506 239/500 [=============>................] - ETA: 1:40 - loss: 0.5570 - regression_loss: 0.5064 - classification_loss: 0.0506 240/500 [=============>................] - ETA: 1:40 - loss: 0.5565 - regression_loss: 0.5061 - classification_loss: 0.0505 241/500 [=============>................] - ETA: 1:40 - loss: 0.5556 - regression_loss: 0.5053 - classification_loss: 0.0504 242/500 [=============>................] - ETA: 1:39 - loss: 0.5557 - regression_loss: 0.5052 - classification_loss: 0.0504 243/500 [=============>................] - ETA: 1:39 - loss: 0.5551 - regression_loss: 0.5048 - classification_loss: 0.0504 244/500 [=============>................] - ETA: 1:39 - loss: 0.5535 - regression_loss: 0.5033 - classification_loss: 0.0502 245/500 [=============>................] - ETA: 1:38 - loss: 0.5543 - regression_loss: 0.5040 - classification_loss: 0.0503 246/500 [=============>................] - ETA: 1:38 - loss: 0.5540 - regression_loss: 0.5038 - classification_loss: 0.0502 247/500 [=============>................] - ETA: 1:37 - loss: 0.5544 - regression_loss: 0.5041 - classification_loss: 0.0503 248/500 [=============>................] - ETA: 1:37 - loss: 0.5551 - regression_loss: 0.5047 - classification_loss: 0.0505 249/500 [=============>................] - ETA: 1:37 - loss: 0.5549 - regression_loss: 0.5045 - classification_loss: 0.0504 250/500 [==============>...............] - ETA: 1:36 - loss: 0.5560 - regression_loss: 0.5054 - classification_loss: 0.0505 251/500 [==============>...............] - ETA: 1:36 - loss: 0.5555 - regression_loss: 0.5050 - classification_loss: 0.0505 252/500 [==============>...............] - ETA: 1:35 - loss: 0.5548 - regression_loss: 0.5044 - classification_loss: 0.0504 253/500 [==============>...............] - ETA: 1:35 - loss: 0.5543 - regression_loss: 0.5040 - classification_loss: 0.0503 254/500 [==============>...............] 
- ETA: 1:35 - loss: 0.5551 - regression_loss: 0.5047 - classification_loss: 0.0504 255/500 [==============>...............] - ETA: 1:34 - loss: 0.5565 - regression_loss: 0.5061 - classification_loss: 0.0504 256/500 [==============>...............] - ETA: 1:34 - loss: 0.5567 - regression_loss: 0.5063 - classification_loss: 0.0504 257/500 [==============>...............] - ETA: 1:34 - loss: 0.5576 - regression_loss: 0.5072 - classification_loss: 0.0504 258/500 [==============>...............] - ETA: 1:33 - loss: 0.5574 - regression_loss: 0.5069 - classification_loss: 0.0505 259/500 [==============>...............] - ETA: 1:33 - loss: 0.5581 - regression_loss: 0.5076 - classification_loss: 0.0505 260/500 [==============>...............] - ETA: 1:32 - loss: 0.5581 - regression_loss: 0.5076 - classification_loss: 0.0504 261/500 [==============>...............] - ETA: 1:32 - loss: 0.5585 - regression_loss: 0.5081 - classification_loss: 0.0504 262/500 [==============>...............] - ETA: 1:32 - loss: 0.5580 - regression_loss: 0.5077 - classification_loss: 0.0503 263/500 [==============>...............] - ETA: 1:31 - loss: 0.5583 - regression_loss: 0.5080 - classification_loss: 0.0503 264/500 [==============>...............] - ETA: 1:31 - loss: 0.5581 - regression_loss: 0.5078 - classification_loss: 0.0503 265/500 [==============>...............] - ETA: 1:30 - loss: 0.5569 - regression_loss: 0.5066 - classification_loss: 0.0502 266/500 [==============>...............] - ETA: 1:30 - loss: 0.5565 - regression_loss: 0.5062 - classification_loss: 0.0503 267/500 [===============>..............] - ETA: 1:30 - loss: 0.5568 - regression_loss: 0.5065 - classification_loss: 0.0503 268/500 [===============>..............] - ETA: 1:29 - loss: 0.5562 - regression_loss: 0.5060 - classification_loss: 0.0502 269/500 [===============>..............] - ETA: 1:29 - loss: 0.5556 - regression_loss: 0.5054 - classification_loss: 0.0502 270/500 [===============>..............] 
- ETA: 1:29 - loss: 0.5558 - regression_loss: 0.5056 - classification_loss: 0.0501 271/500 [===============>..............] - ETA: 1:28 - loss: 0.5561 - regression_loss: 0.5060 - classification_loss: 0.0501 272/500 [===============>..............] - ETA: 1:28 - loss: 0.5549 - regression_loss: 0.5049 - classification_loss: 0.0500 273/500 [===============>..............] - ETA: 1:27 - loss: 0.5551 - regression_loss: 0.5051 - classification_loss: 0.0499 274/500 [===============>..............] - ETA: 1:27 - loss: 0.5542 - regression_loss: 0.5043 - classification_loss: 0.0499 275/500 [===============>..............] - ETA: 1:27 - loss: 0.5543 - regression_loss: 0.5044 - classification_loss: 0.0499 276/500 [===============>..............] - ETA: 1:26 - loss: 0.5536 - regression_loss: 0.5038 - classification_loss: 0.0498 277/500 [===============>..............] - ETA: 1:26 - loss: 0.5527 - regression_loss: 0.5030 - classification_loss: 0.0497 278/500 [===============>..............] - ETA: 1:26 - loss: 0.5531 - regression_loss: 0.5034 - classification_loss: 0.0497 279/500 [===============>..............] - ETA: 1:25 - loss: 0.5541 - regression_loss: 0.5043 - classification_loss: 0.0498 280/500 [===============>..............] - ETA: 1:25 - loss: 0.5533 - regression_loss: 0.5036 - classification_loss: 0.0497 281/500 [===============>..............] - ETA: 1:24 - loss: 0.5524 - regression_loss: 0.5028 - classification_loss: 0.0497 282/500 [===============>..............] - ETA: 1:24 - loss: 0.5525 - regression_loss: 0.5029 - classification_loss: 0.0496 283/500 [===============>..............] - ETA: 1:24 - loss: 0.5538 - regression_loss: 0.5040 - classification_loss: 0.0499 284/500 [================>.............] - ETA: 1:23 - loss: 0.5545 - regression_loss: 0.5046 - classification_loss: 0.0499 285/500 [================>.............] - ETA: 1:23 - loss: 0.5547 - regression_loss: 0.5048 - classification_loss: 0.0499 286/500 [================>.............] 
- ETA: 1:22 - loss: 0.5547 - regression_loss: 0.5049 - classification_loss: 0.0499 287/500 [================>.............] - ETA: 1:22 - loss: 0.5549 - regression_loss: 0.5052 - classification_loss: 0.0497 288/500 [================>.............] - ETA: 1:22 - loss: 0.5556 - regression_loss: 0.5059 - classification_loss: 0.0497 289/500 [================>.............] - ETA: 1:21 - loss: 0.5550 - regression_loss: 0.5054 - classification_loss: 0.0496 290/500 [================>.............] - ETA: 1:21 - loss: 0.5559 - regression_loss: 0.5063 - classification_loss: 0.0496 291/500 [================>.............] - ETA: 1:21 - loss: 0.5558 - regression_loss: 0.5063 - classification_loss: 0.0495 292/500 [================>.............] - ETA: 1:20 - loss: 0.5550 - regression_loss: 0.5055 - classification_loss: 0.0495 293/500 [================>.............] - ETA: 1:20 - loss: 0.5567 - regression_loss: 0.5067 - classification_loss: 0.0501 294/500 [================>.............] - ETA: 1:19 - loss: 0.5563 - regression_loss: 0.5063 - classification_loss: 0.0500 295/500 [================>.............] - ETA: 1:19 - loss: 0.5580 - regression_loss: 0.5078 - classification_loss: 0.0502 296/500 [================>.............] - ETA: 1:19 - loss: 0.5576 - regression_loss: 0.5075 - classification_loss: 0.0502 297/500 [================>.............] - ETA: 1:18 - loss: 0.5571 - regression_loss: 0.5070 - classification_loss: 0.0501 298/500 [================>.............] - ETA: 1:18 - loss: 0.5570 - regression_loss: 0.5070 - classification_loss: 0.0500 299/500 [================>.............] - ETA: 1:17 - loss: 0.5566 - regression_loss: 0.5066 - classification_loss: 0.0499 300/500 [=================>............] - ETA: 1:17 - loss: 0.5567 - regression_loss: 0.5069 - classification_loss: 0.0499 301/500 [=================>............] - ETA: 1:17 - loss: 0.5562 - regression_loss: 0.5064 - classification_loss: 0.0498 302/500 [=================>............] 
- ETA: 1:16 - loss: 0.5565 - regression_loss: 0.5068 - classification_loss: 0.0497 303/500 [=================>............] - ETA: 1:16 - loss: 0.5568 - regression_loss: 0.5071 - classification_loss: 0.0497 304/500 [=================>............] - ETA: 1:16 - loss: 0.5570 - regression_loss: 0.5073 - classification_loss: 0.0498 305/500 [=================>............] - ETA: 1:15 - loss: 0.5555 - regression_loss: 0.5059 - classification_loss: 0.0496 306/500 [=================>............] - ETA: 1:15 - loss: 0.5548 - regression_loss: 0.5052 - classification_loss: 0.0495 307/500 [=================>............] - ETA: 1:14 - loss: 0.5547 - regression_loss: 0.5053 - classification_loss: 0.0494 308/500 [=================>............] - ETA: 1:14 - loss: 0.5553 - regression_loss: 0.5058 - classification_loss: 0.0494 309/500 [=================>............] - ETA: 1:14 - loss: 0.5551 - regression_loss: 0.5057 - classification_loss: 0.0494 310/500 [=================>............] - ETA: 1:13 - loss: 0.5553 - regression_loss: 0.5059 - classification_loss: 0.0495 311/500 [=================>............] - ETA: 1:13 - loss: 0.5559 - regression_loss: 0.5065 - classification_loss: 0.0495 312/500 [=================>............] - ETA: 1:12 - loss: 0.5561 - regression_loss: 0.5065 - classification_loss: 0.0496 313/500 [=================>............] - ETA: 1:12 - loss: 0.5558 - regression_loss: 0.5062 - classification_loss: 0.0495 314/500 [=================>............] - ETA: 1:12 - loss: 0.5553 - regression_loss: 0.5059 - classification_loss: 0.0495 315/500 [=================>............] - ETA: 1:11 - loss: 0.5551 - regression_loss: 0.5057 - classification_loss: 0.0494 316/500 [=================>............] - ETA: 1:11 - loss: 0.5552 - regression_loss: 0.5058 - classification_loss: 0.0494 317/500 [==================>...........] - ETA: 1:10 - loss: 0.5550 - regression_loss: 0.5057 - classification_loss: 0.0493 318/500 [==================>...........] 
- ETA: 1:10 - loss: 0.5546 - regression_loss: 0.5053 - classification_loss: 0.0493 319/500 [==================>...........] - ETA: 1:10 - loss: 0.5536 - regression_loss: 0.5044 - classification_loss: 0.0492 320/500 [==================>...........] - ETA: 1:09 - loss: 0.5543 - regression_loss: 0.5051 - classification_loss: 0.0492 321/500 [==================>...........] - ETA: 1:09 - loss: 0.5542 - regression_loss: 0.5050 - classification_loss: 0.0492 322/500 [==================>...........] - ETA: 1:09 - loss: 0.5530 - regression_loss: 0.5039 - classification_loss: 0.0491 323/500 [==================>...........] - ETA: 1:08 - loss: 0.5530 - regression_loss: 0.5040 - classification_loss: 0.0490 324/500 [==================>...........] - ETA: 1:08 - loss: 0.5526 - regression_loss: 0.5036 - classification_loss: 0.0490 325/500 [==================>...........] - ETA: 1:07 - loss: 0.5533 - regression_loss: 0.5043 - classification_loss: 0.0490 326/500 [==================>...........] - ETA: 1:07 - loss: 0.5526 - regression_loss: 0.5036 - classification_loss: 0.0489 327/500 [==================>...........] - ETA: 1:07 - loss: 0.5517 - regression_loss: 0.5029 - classification_loss: 0.0488 328/500 [==================>...........] - ETA: 1:06 - loss: 0.5518 - regression_loss: 0.5030 - classification_loss: 0.0488 329/500 [==================>...........] - ETA: 1:06 - loss: 0.5532 - regression_loss: 0.5041 - classification_loss: 0.0491 330/500 [==================>...........] - ETA: 1:05 - loss: 0.5547 - regression_loss: 0.5057 - classification_loss: 0.0491 331/500 [==================>...........] - ETA: 1:05 - loss: 0.5536 - regression_loss: 0.5046 - classification_loss: 0.0490 332/500 [==================>...........] - ETA: 1:05 - loss: 0.5537 - regression_loss: 0.5047 - classification_loss: 0.0490 333/500 [==================>...........] - ETA: 1:04 - loss: 0.5536 - regression_loss: 0.5046 - classification_loss: 0.0490 334/500 [===================>..........] 
- ETA: 1:04 - loss: 0.5533 - regression_loss: 0.5043 - classification_loss: 0.0489 335/500 [===================>..........] - ETA: 1:04 - loss: 0.5531 - regression_loss: 0.5043 - classification_loss: 0.0488 336/500 [===================>..........] - ETA: 1:03 - loss: 0.5531 - regression_loss: 0.5042 - classification_loss: 0.0488 337/500 [===================>..........] - ETA: 1:03 - loss: 0.5528 - regression_loss: 0.5040 - classification_loss: 0.0488 338/500 [===================>..........] - ETA: 1:02 - loss: 0.5531 - regression_loss: 0.5042 - classification_loss: 0.0489 339/500 [===================>..........] - ETA: 1:02 - loss: 0.5524 - regression_loss: 0.5036 - classification_loss: 0.0488 340/500 [===================>..........] - ETA: 1:02 - loss: 0.5519 - regression_loss: 0.5032 - classification_loss: 0.0488 341/500 [===================>..........] - ETA: 1:01 - loss: 0.5523 - regression_loss: 0.5036 - classification_loss: 0.0488 342/500 [===================>..........] - ETA: 1:01 - loss: 0.5523 - regression_loss: 0.5036 - classification_loss: 0.0488 343/500 [===================>..........] - ETA: 1:00 - loss: 0.5518 - regression_loss: 0.5031 - classification_loss: 0.0487 344/500 [===================>..........] - ETA: 1:00 - loss: 0.5533 - regression_loss: 0.5044 - classification_loss: 0.0490 345/500 [===================>..........] - ETA: 1:00 - loss: 0.5530 - regression_loss: 0.5041 - classification_loss: 0.0489 346/500 [===================>..........] - ETA: 59s - loss: 0.5522 - regression_loss: 0.5034 - classification_loss: 0.0488  347/500 [===================>..........] - ETA: 59s - loss: 0.5530 - regression_loss: 0.5041 - classification_loss: 0.0489 348/500 [===================>..........] - ETA: 58s - loss: 0.5533 - regression_loss: 0.5044 - classification_loss: 0.0489 349/500 [===================>..........] - ETA: 58s - loss: 0.5552 - regression_loss: 0.5061 - classification_loss: 0.0490 350/500 [====================>.........] 
- ETA: 58s - loss: 0.5547 - regression_loss: 0.5057 - classification_loss: 0.0490 351/500 [====================>.........] - ETA: 57s - loss: 0.5545 - regression_loss: 0.5055 - classification_loss: 0.0490 352/500 [====================>.........] - ETA: 57s - loss: 0.5541 - regression_loss: 0.5051 - classification_loss: 0.0490 353/500 [====================>.........] - ETA: 57s - loss: 0.5537 - regression_loss: 0.5047 - classification_loss: 0.0489 354/500 [====================>.........] - ETA: 56s - loss: 0.5552 - regression_loss: 0.5060 - classification_loss: 0.0492 355/500 [====================>.........] - ETA: 56s - loss: 0.5566 - regression_loss: 0.5072 - classification_loss: 0.0494 356/500 [====================>.........] - ETA: 55s - loss: 0.5572 - regression_loss: 0.5077 - classification_loss: 0.0494 357/500 [====================>.........] - ETA: 55s - loss: 0.5567 - regression_loss: 0.5074 - classification_loss: 0.0494 358/500 [====================>.........] - ETA: 55s - loss: 0.5580 - regression_loss: 0.5084 - classification_loss: 0.0496 359/500 [====================>.........] - ETA: 54s - loss: 0.5574 - regression_loss: 0.5079 - classification_loss: 0.0495 360/500 [====================>.........] - ETA: 54s - loss: 0.5563 - regression_loss: 0.5069 - classification_loss: 0.0495 361/500 [====================>.........] - ETA: 53s - loss: 0.5561 - regression_loss: 0.5067 - classification_loss: 0.0494 362/500 [====================>.........] - ETA: 53s - loss: 0.5559 - regression_loss: 0.5066 - classification_loss: 0.0494 363/500 [====================>.........] - ETA: 53s - loss: 0.5557 - regression_loss: 0.5064 - classification_loss: 0.0493 364/500 [====================>.........] - ETA: 52s - loss: 0.5550 - regression_loss: 0.5057 - classification_loss: 0.0493 365/500 [====================>.........] - ETA: 52s - loss: 0.5548 - regression_loss: 0.5056 - classification_loss: 0.0492 366/500 [====================>.........] 
- ETA: 51s - loss: 0.5543 - regression_loss: 0.5051 - classification_loss: 0.0492 367/500 [=====================>........] - ETA: 51s - loss: 0.5544 - regression_loss: 0.5052 - classification_loss: 0.0492 368/500 [=====================>........] - ETA: 51s - loss: 0.5542 - regression_loss: 0.5050 - classification_loss: 0.0492 369/500 [=====================>........] - ETA: 50s - loss: 0.5535 - regression_loss: 0.5044 - classification_loss: 0.0491 370/500 [=====================>........] - ETA: 50s - loss: 0.5531 - regression_loss: 0.5040 - classification_loss: 0.0490 371/500 [=====================>........] - ETA: 50s - loss: 0.5530 - regression_loss: 0.5039 - classification_loss: 0.0490 372/500 [=====================>........] - ETA: 49s - loss: 0.5525 - regression_loss: 0.5035 - classification_loss: 0.0490 373/500 [=====================>........] - ETA: 49s - loss: 0.5522 - regression_loss: 0.5032 - classification_loss: 0.0490 374/500 [=====================>........] - ETA: 48s - loss: 0.5513 - regression_loss: 0.5024 - classification_loss: 0.0489 375/500 [=====================>........] - ETA: 48s - loss: 0.5504 - regression_loss: 0.5016 - classification_loss: 0.0488 376/500 [=====================>........] - ETA: 48s - loss: 0.5499 - regression_loss: 0.5010 - classification_loss: 0.0488 377/500 [=====================>........] - ETA: 47s - loss: 0.5506 - regression_loss: 0.5018 - classification_loss: 0.0489 378/500 [=====================>........] - ETA: 47s - loss: 0.5497 - regression_loss: 0.5009 - classification_loss: 0.0488 379/500 [=====================>........] - ETA: 46s - loss: 0.5505 - regression_loss: 0.5015 - classification_loss: 0.0489 380/500 [=====================>........] - ETA: 46s - loss: 0.5502 - regression_loss: 0.5013 - classification_loss: 0.0489 381/500 [=====================>........] - ETA: 46s - loss: 0.5500 - regression_loss: 0.5010 - classification_loss: 0.0489 382/500 [=====================>........] 
- ETA: 45s - loss: 0.5496 - regression_loss: 0.5008 - classification_loss: 0.0489 383/500 [=====================>........] - ETA: 45s - loss: 0.5495 - regression_loss: 0.5007 - classification_loss: 0.0488 384/500 [======================>.......] - ETA: 44s - loss: 0.5493 - regression_loss: 0.5006 - classification_loss: 0.0488 385/500 [======================>.......] - ETA: 44s - loss: 0.5497 - regression_loss: 0.5008 - classification_loss: 0.0488 386/500 [======================>.......] - ETA: 44s - loss: 0.5494 - regression_loss: 0.5006 - classification_loss: 0.0488 387/500 [======================>.......] - ETA: 43s - loss: 0.5490 - regression_loss: 0.5003 - classification_loss: 0.0487 388/500 [======================>.......] - ETA: 43s - loss: 0.5483 - regression_loss: 0.4996 - classification_loss: 0.0486 389/500 [======================>.......] - ETA: 43s - loss: 0.5483 - regression_loss: 0.4997 - classification_loss: 0.0486 390/500 [======================>.......] - ETA: 42s - loss: 0.5475 - regression_loss: 0.4989 - classification_loss: 0.0486 391/500 [======================>.......] - ETA: 42s - loss: 0.5480 - regression_loss: 0.4995 - classification_loss: 0.0486 392/500 [======================>.......] - ETA: 41s - loss: 0.5483 - regression_loss: 0.4998 - classification_loss: 0.0486 393/500 [======================>.......] - ETA: 41s - loss: 0.5486 - regression_loss: 0.5000 - classification_loss: 0.0486 394/500 [======================>.......] - ETA: 41s - loss: 0.5480 - regression_loss: 0.4994 - classification_loss: 0.0485 395/500 [======================>.......] - ETA: 40s - loss: 0.5478 - regression_loss: 0.4993 - classification_loss: 0.0485 396/500 [======================>.......] - ETA: 40s - loss: 0.5479 - regression_loss: 0.4994 - classification_loss: 0.0485 397/500 [======================>.......] - ETA: 39s - loss: 0.5473 - regression_loss: 0.4988 - classification_loss: 0.0485 398/500 [======================>.......] 
- ETA: 39s - loss: 0.5468 - regression_loss: 0.4984 - classification_loss: 0.0484 399/500 [======================>.......] - ETA: 39s - loss: 0.5464 - regression_loss: 0.4980 - classification_loss: 0.0484 400/500 [=======================>......] - ETA: 38s - loss: 0.5458 - regression_loss: 0.4975 - classification_loss: 0.0483 401/500 [=======================>......] - ETA: 38s - loss: 0.5464 - regression_loss: 0.4980 - classification_loss: 0.0484 402/500 [=======================>......] - ETA: 38s - loss: 0.5463 - regression_loss: 0.4979 - classification_loss: 0.0484 403/500 [=======================>......] - ETA: 37s - loss: 0.5464 - regression_loss: 0.4980 - classification_loss: 0.0484 404/500 [=======================>......] - ETA: 37s - loss: 0.5463 - regression_loss: 0.4979 - classification_loss: 0.0484 405/500 [=======================>......] - ETA: 36s - loss: 0.5458 - regression_loss: 0.4975 - classification_loss: 0.0483 406/500 [=======================>......] - ETA: 36s - loss: 0.5464 - regression_loss: 0.4980 - classification_loss: 0.0484 407/500 [=======================>......] - ETA: 36s - loss: 0.5463 - regression_loss: 0.4979 - classification_loss: 0.0484 408/500 [=======================>......] - ETA: 35s - loss: 0.5456 - regression_loss: 0.4972 - classification_loss: 0.0484 409/500 [=======================>......] - ETA: 35s - loss: 0.5453 - regression_loss: 0.4970 - classification_loss: 0.0483 410/500 [=======================>......] - ETA: 34s - loss: 0.5453 - regression_loss: 0.4969 - classification_loss: 0.0483 411/500 [=======================>......] - ETA: 34s - loss: 0.5450 - regression_loss: 0.4968 - classification_loss: 0.0483 412/500 [=======================>......] - ETA: 34s - loss: 0.5454 - regression_loss: 0.4970 - classification_loss: 0.0484 413/500 [=======================>......] - ETA: 33s - loss: 0.5456 - regression_loss: 0.4972 - classification_loss: 0.0484 414/500 [=======================>......] 
- ETA: 33s - loss: 0.5453 - regression_loss: 0.4969 - classification_loss: 0.0484 415/500 [=======================>......] - ETA: 32s - loss: 0.5457 - regression_loss: 0.4973 - classification_loss: 0.0484 416/500 [=======================>......] - ETA: 32s - loss: 0.5461 - regression_loss: 0.4977 - classification_loss: 0.0484 417/500 [========================>.....] - ETA: 32s - loss: 0.5461 - regression_loss: 0.4977 - classification_loss: 0.0484 418/500 [========================>.....] - ETA: 31s - loss: 0.5461 - regression_loss: 0.4977 - classification_loss: 0.0485 419/500 [========================>.....] - ETA: 31s - loss: 0.5457 - regression_loss: 0.4973 - classification_loss: 0.0484 420/500 [========================>.....] - ETA: 31s - loss: 0.5451 - regression_loss: 0.4968 - classification_loss: 0.0484 421/500 [========================>.....] - ETA: 30s - loss: 0.5447 - regression_loss: 0.4964 - classification_loss: 0.0483 422/500 [========================>.....] - ETA: 30s - loss: 0.5450 - regression_loss: 0.4967 - classification_loss: 0.0483 423/500 [========================>.....] - ETA: 29s - loss: 0.5446 - regression_loss: 0.4964 - classification_loss: 0.0483 424/500 [========================>.....] - ETA: 29s - loss: 0.5438 - regression_loss: 0.4955 - classification_loss: 0.0482 425/500 [========================>.....] - ETA: 29s - loss: 0.5443 - regression_loss: 0.4959 - classification_loss: 0.0484 426/500 [========================>.....] - ETA: 28s - loss: 0.5443 - regression_loss: 0.4959 - classification_loss: 0.0484 427/500 [========================>.....] - ETA: 28s - loss: 0.5443 - regression_loss: 0.4959 - classification_loss: 0.0483 428/500 [========================>.....] - ETA: 27s - loss: 0.5444 - regression_loss: 0.4960 - classification_loss: 0.0484 429/500 [========================>.....] - ETA: 27s - loss: 0.5442 - regression_loss: 0.4959 - classification_loss: 0.0483 430/500 [========================>.....] 
- ETA: 27s - loss: 0.5437 - regression_loss: 0.4955 - classification_loss: 0.0482 431/500 [========================>.....] - ETA: 26s - loss: 0.5435 - regression_loss: 0.4952 - classification_loss: 0.0483 432/500 [========================>.....] - ETA: 26s - loss: 0.5429 - regression_loss: 0.4947 - classification_loss: 0.0482 433/500 [========================>.....] - ETA: 25s - loss: 0.5433 - regression_loss: 0.4951 - classification_loss: 0.0482 434/500 [=========================>....] - ETA: 25s - loss: 0.5434 - regression_loss: 0.4952 - classification_loss: 0.0483 435/500 [=========================>....] - ETA: 25s - loss: 0.5446 - regression_loss: 0.4961 - classification_loss: 0.0485 436/500 [=========================>....] - ETA: 24s - loss: 0.5440 - regression_loss: 0.4956 - classification_loss: 0.0484 437/500 [=========================>....] - ETA: 24s - loss: 0.5442 - regression_loss: 0.4957 - classification_loss: 0.0485 438/500 [=========================>....] - ETA: 24s - loss: 0.5444 - regression_loss: 0.4960 - classification_loss: 0.0484 439/500 [=========================>....] - ETA: 23s - loss: 0.5445 - regression_loss: 0.4961 - classification_loss: 0.0484 440/500 [=========================>....] - ETA: 23s - loss: 0.5446 - regression_loss: 0.4961 - classification_loss: 0.0484 441/500 [=========================>....] - ETA: 22s - loss: 0.5449 - regression_loss: 0.4965 - classification_loss: 0.0484 442/500 [=========================>....] - ETA: 22s - loss: 0.5450 - regression_loss: 0.4966 - classification_loss: 0.0484 443/500 [=========================>....] - ETA: 22s - loss: 0.5451 - regression_loss: 0.4968 - classification_loss: 0.0484 444/500 [=========================>....] - ETA: 21s - loss: 0.5450 - regression_loss: 0.4966 - classification_loss: 0.0483 445/500 [=========================>....] - ETA: 21s - loss: 0.5449 - regression_loss: 0.4966 - classification_loss: 0.0483 446/500 [=========================>....] 
[per-step progress output for epoch 16, steps 447–499, elided]
500/500 [==============================] - 194s 388ms/step - loss: 0.5475 - regression_loss: 0.4992 - classification_loss: 0.0483
326 instances of class plum with average precision: 0.8385
mAP: 0.8385
Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5
Epoch 17/20
[per-step progress output for epoch 17, steps 1–280, elided]
281/500 [===============>..............]
- ETA: 1:25 - loss: 0.5265 - regression_loss: 0.4817 - classification_loss: 0.0448 282/500 [===============>..............] - ETA: 1:25 - loss: 0.5266 - regression_loss: 0.4819 - classification_loss: 0.0448 283/500 [===============>..............] - ETA: 1:24 - loss: 0.5254 - regression_loss: 0.4807 - classification_loss: 0.0447 284/500 [================>.............] - ETA: 1:24 - loss: 0.5251 - regression_loss: 0.4806 - classification_loss: 0.0446 285/500 [================>.............] - ETA: 1:24 - loss: 0.5262 - regression_loss: 0.4814 - classification_loss: 0.0448 286/500 [================>.............] - ETA: 1:23 - loss: 0.5287 - regression_loss: 0.4835 - classification_loss: 0.0452 287/500 [================>.............] - ETA: 1:23 - loss: 0.5284 - regression_loss: 0.4833 - classification_loss: 0.0451 288/500 [================>.............] - ETA: 1:22 - loss: 0.5305 - regression_loss: 0.4852 - classification_loss: 0.0453 289/500 [================>.............] - ETA: 1:22 - loss: 0.5297 - regression_loss: 0.4844 - classification_loss: 0.0453 290/500 [================>.............] - ETA: 1:22 - loss: 0.5296 - regression_loss: 0.4844 - classification_loss: 0.0452 291/500 [================>.............] - ETA: 1:21 - loss: 0.5291 - regression_loss: 0.4840 - classification_loss: 0.0451 292/500 [================>.............] - ETA: 1:21 - loss: 0.5300 - regression_loss: 0.4848 - classification_loss: 0.0452 293/500 [================>.............] - ETA: 1:20 - loss: 0.5309 - regression_loss: 0.4858 - classification_loss: 0.0451 294/500 [================>.............] - ETA: 1:20 - loss: 0.5317 - regression_loss: 0.4866 - classification_loss: 0.0451 295/500 [================>.............] - ETA: 1:20 - loss: 0.5330 - regression_loss: 0.4878 - classification_loss: 0.0452 296/500 [================>.............] - ETA: 1:19 - loss: 0.5333 - regression_loss: 0.4882 - classification_loss: 0.0451 297/500 [================>.............] 
- ETA: 1:19 - loss: 0.5335 - regression_loss: 0.4884 - classification_loss: 0.0451 298/500 [================>.............] - ETA: 1:18 - loss: 0.5336 - regression_loss: 0.4885 - classification_loss: 0.0451 299/500 [================>.............] - ETA: 1:18 - loss: 0.5332 - regression_loss: 0.4882 - classification_loss: 0.0450 300/500 [=================>............] - ETA: 1:18 - loss: 0.5345 - regression_loss: 0.4893 - classification_loss: 0.0452 301/500 [=================>............] - ETA: 1:17 - loss: 0.5341 - regression_loss: 0.4889 - classification_loss: 0.0452 302/500 [=================>............] - ETA: 1:17 - loss: 0.5332 - regression_loss: 0.4881 - classification_loss: 0.0451 303/500 [=================>............] - ETA: 1:16 - loss: 0.5324 - regression_loss: 0.4873 - classification_loss: 0.0450 304/500 [=================>............] - ETA: 1:16 - loss: 0.5318 - regression_loss: 0.4868 - classification_loss: 0.0450 305/500 [=================>............] - ETA: 1:16 - loss: 0.5315 - regression_loss: 0.4865 - classification_loss: 0.0450 306/500 [=================>............] - ETA: 1:15 - loss: 0.5322 - regression_loss: 0.4871 - classification_loss: 0.0451 307/500 [=================>............] - ETA: 1:15 - loss: 0.5336 - regression_loss: 0.4883 - classification_loss: 0.0453 308/500 [=================>............] - ETA: 1:14 - loss: 0.5339 - regression_loss: 0.4886 - classification_loss: 0.0453 309/500 [=================>............] - ETA: 1:14 - loss: 0.5338 - regression_loss: 0.4885 - classification_loss: 0.0453 310/500 [=================>............] - ETA: 1:14 - loss: 0.5335 - regression_loss: 0.4882 - classification_loss: 0.0453 311/500 [=================>............] - ETA: 1:13 - loss: 0.5339 - regression_loss: 0.4886 - classification_loss: 0.0453 312/500 [=================>............] - ETA: 1:13 - loss: 0.5339 - regression_loss: 0.4887 - classification_loss: 0.0452 313/500 [=================>............] 
- ETA: 1:13 - loss: 0.5340 - regression_loss: 0.4889 - classification_loss: 0.0452 314/500 [=================>............] - ETA: 1:12 - loss: 0.5339 - regression_loss: 0.4888 - classification_loss: 0.0451 315/500 [=================>............] - ETA: 1:12 - loss: 0.5341 - regression_loss: 0.4890 - classification_loss: 0.0451 316/500 [=================>............] - ETA: 1:11 - loss: 0.5345 - regression_loss: 0.4894 - classification_loss: 0.0451 317/500 [==================>...........] - ETA: 1:11 - loss: 0.5375 - regression_loss: 0.4923 - classification_loss: 0.0451 318/500 [==================>...........] - ETA: 1:11 - loss: 0.5378 - regression_loss: 0.4926 - classification_loss: 0.0452 319/500 [==================>...........] - ETA: 1:10 - loss: 0.5411 - regression_loss: 0.4958 - classification_loss: 0.0453 320/500 [==================>...........] - ETA: 1:10 - loss: 0.5440 - regression_loss: 0.4981 - classification_loss: 0.0460 321/500 [==================>...........] - ETA: 1:09 - loss: 0.5447 - regression_loss: 0.4988 - classification_loss: 0.0459 322/500 [==================>...........] - ETA: 1:09 - loss: 0.5442 - regression_loss: 0.4983 - classification_loss: 0.0458 323/500 [==================>...........] - ETA: 1:09 - loss: 0.5443 - regression_loss: 0.4984 - classification_loss: 0.0459 324/500 [==================>...........] - ETA: 1:08 - loss: 0.5443 - regression_loss: 0.4984 - classification_loss: 0.0459 325/500 [==================>...........] - ETA: 1:08 - loss: 0.5436 - regression_loss: 0.4978 - classification_loss: 0.0458 326/500 [==================>...........] - ETA: 1:07 - loss: 0.5435 - regression_loss: 0.4978 - classification_loss: 0.0457 327/500 [==================>...........] - ETA: 1:07 - loss: 0.5440 - regression_loss: 0.4983 - classification_loss: 0.0458 328/500 [==================>...........] - ETA: 1:07 - loss: 0.5448 - regression_loss: 0.4989 - classification_loss: 0.0458 329/500 [==================>...........] 
- ETA: 1:06 - loss: 0.5445 - regression_loss: 0.4987 - classification_loss: 0.0458 330/500 [==================>...........] - ETA: 1:06 - loss: 0.5440 - regression_loss: 0.4983 - classification_loss: 0.0457 331/500 [==================>...........] - ETA: 1:05 - loss: 0.5442 - regression_loss: 0.4985 - classification_loss: 0.0457 332/500 [==================>...........] - ETA: 1:05 - loss: 0.5445 - regression_loss: 0.4988 - classification_loss: 0.0456 333/500 [==================>...........] - ETA: 1:05 - loss: 0.5442 - regression_loss: 0.4986 - classification_loss: 0.0456 334/500 [===================>..........] - ETA: 1:04 - loss: 0.5445 - regression_loss: 0.4989 - classification_loss: 0.0457 335/500 [===================>..........] - ETA: 1:04 - loss: 0.5449 - regression_loss: 0.4993 - classification_loss: 0.0456 336/500 [===================>..........] - ETA: 1:03 - loss: 0.5448 - regression_loss: 0.4992 - classification_loss: 0.0456 337/500 [===================>..........] - ETA: 1:03 - loss: 0.5466 - regression_loss: 0.5007 - classification_loss: 0.0459 338/500 [===================>..........] - ETA: 1:03 - loss: 0.5467 - regression_loss: 0.5007 - classification_loss: 0.0460 339/500 [===================>..........] - ETA: 1:02 - loss: 0.5460 - regression_loss: 0.5000 - classification_loss: 0.0459 340/500 [===================>..........] - ETA: 1:02 - loss: 0.5461 - regression_loss: 0.5002 - classification_loss: 0.0460 341/500 [===================>..........] - ETA: 1:02 - loss: 0.5456 - regression_loss: 0.4996 - classification_loss: 0.0459 342/500 [===================>..........] - ETA: 1:01 - loss: 0.5459 - regression_loss: 0.4999 - classification_loss: 0.0460 343/500 [===================>..........] - ETA: 1:01 - loss: 0.5464 - regression_loss: 0.5004 - classification_loss: 0.0460 344/500 [===================>..........] - ETA: 1:00 - loss: 0.5456 - regression_loss: 0.4996 - classification_loss: 0.0459 345/500 [===================>..........] 
- ETA: 1:00 - loss: 0.5451 - regression_loss: 0.4992 - classification_loss: 0.0459 346/500 [===================>..........] - ETA: 1:00 - loss: 0.5456 - regression_loss: 0.4995 - classification_loss: 0.0461 347/500 [===================>..........] - ETA: 59s - loss: 0.5451 - regression_loss: 0.4990 - classification_loss: 0.0460  348/500 [===================>..........] - ETA: 59s - loss: 0.5444 - regression_loss: 0.4984 - classification_loss: 0.0460 349/500 [===================>..........] - ETA: 58s - loss: 0.5442 - regression_loss: 0.4983 - classification_loss: 0.0459 350/500 [====================>.........] - ETA: 58s - loss: 0.5440 - regression_loss: 0.4981 - classification_loss: 0.0459 351/500 [====================>.........] - ETA: 58s - loss: 0.5435 - regression_loss: 0.4976 - classification_loss: 0.0459 352/500 [====================>.........] - ETA: 57s - loss: 0.5431 - regression_loss: 0.4973 - classification_loss: 0.0459 353/500 [====================>.........] - ETA: 57s - loss: 0.5425 - regression_loss: 0.4967 - classification_loss: 0.0458 354/500 [====================>.........] - ETA: 56s - loss: 0.5422 - regression_loss: 0.4964 - classification_loss: 0.0457 355/500 [====================>.........] - ETA: 56s - loss: 0.5417 - regression_loss: 0.4960 - classification_loss: 0.0457 356/500 [====================>.........] - ETA: 56s - loss: 0.5414 - regression_loss: 0.4958 - classification_loss: 0.0456 357/500 [====================>.........] - ETA: 55s - loss: 0.5404 - regression_loss: 0.4948 - classification_loss: 0.0456 358/500 [====================>.........] - ETA: 55s - loss: 0.5397 - regression_loss: 0.4942 - classification_loss: 0.0455 359/500 [====================>.........] - ETA: 55s - loss: 0.5406 - regression_loss: 0.4950 - classification_loss: 0.0456 360/500 [====================>.........] - ETA: 54s - loss: 0.5404 - regression_loss: 0.4949 - classification_loss: 0.0455 361/500 [====================>.........] 
- ETA: 54s - loss: 0.5410 - regression_loss: 0.4954 - classification_loss: 0.0456 362/500 [====================>.........] - ETA: 53s - loss: 0.5404 - regression_loss: 0.4949 - classification_loss: 0.0456 363/500 [====================>.........] - ETA: 53s - loss: 0.5402 - regression_loss: 0.4947 - classification_loss: 0.0455 364/500 [====================>.........] - ETA: 53s - loss: 0.5401 - regression_loss: 0.4945 - classification_loss: 0.0456 365/500 [====================>.........] - ETA: 52s - loss: 0.5396 - regression_loss: 0.4941 - classification_loss: 0.0456 366/500 [====================>.........] - ETA: 52s - loss: 0.5390 - regression_loss: 0.4936 - classification_loss: 0.0455 367/500 [=====================>........] - ETA: 51s - loss: 0.5384 - regression_loss: 0.4930 - classification_loss: 0.0454 368/500 [=====================>........] - ETA: 51s - loss: 0.5382 - regression_loss: 0.4928 - classification_loss: 0.0454 369/500 [=====================>........] - ETA: 51s - loss: 0.5374 - regression_loss: 0.4921 - classification_loss: 0.0453 370/500 [=====================>........] - ETA: 50s - loss: 0.5372 - regression_loss: 0.4919 - classification_loss: 0.0453 371/500 [=====================>........] - ETA: 50s - loss: 0.5371 - regression_loss: 0.4919 - classification_loss: 0.0452 372/500 [=====================>........] - ETA: 49s - loss: 0.5364 - regression_loss: 0.4912 - classification_loss: 0.0452 373/500 [=====================>........] - ETA: 49s - loss: 0.5364 - regression_loss: 0.4913 - classification_loss: 0.0451 374/500 [=====================>........] - ETA: 49s - loss: 0.5363 - regression_loss: 0.4912 - classification_loss: 0.0451 375/500 [=====================>........] - ETA: 48s - loss: 0.5359 - regression_loss: 0.4908 - classification_loss: 0.0450 376/500 [=====================>........] - ETA: 48s - loss: 0.5374 - regression_loss: 0.4921 - classification_loss: 0.0453 377/500 [=====================>........] 
- ETA: 47s - loss: 0.5371 - regression_loss: 0.4918 - classification_loss: 0.0453 378/500 [=====================>........] - ETA: 47s - loss: 0.5368 - regression_loss: 0.4915 - classification_loss: 0.0452 379/500 [=====================>........] - ETA: 47s - loss: 0.5380 - regression_loss: 0.4926 - classification_loss: 0.0454 380/500 [=====================>........] - ETA: 46s - loss: 0.5376 - regression_loss: 0.4923 - classification_loss: 0.0454 381/500 [=====================>........] - ETA: 46s - loss: 0.5368 - regression_loss: 0.4915 - classification_loss: 0.0453 382/500 [=====================>........] - ETA: 45s - loss: 0.5375 - regression_loss: 0.4921 - classification_loss: 0.0454 383/500 [=====================>........] - ETA: 45s - loss: 0.5366 - regression_loss: 0.4912 - classification_loss: 0.0453 384/500 [======================>.......] - ETA: 45s - loss: 0.5367 - regression_loss: 0.4914 - classification_loss: 0.0453 385/500 [======================>.......] - ETA: 44s - loss: 0.5365 - regression_loss: 0.4912 - classification_loss: 0.0453 386/500 [======================>.......] - ETA: 44s - loss: 0.5360 - regression_loss: 0.4907 - classification_loss: 0.0453 387/500 [======================>.......] - ETA: 44s - loss: 0.5356 - regression_loss: 0.4904 - classification_loss: 0.0452 388/500 [======================>.......] - ETA: 43s - loss: 0.5355 - regression_loss: 0.4903 - classification_loss: 0.0452 389/500 [======================>.......] - ETA: 43s - loss: 0.5348 - regression_loss: 0.4897 - classification_loss: 0.0451 390/500 [======================>.......] - ETA: 42s - loss: 0.5350 - regression_loss: 0.4900 - classification_loss: 0.0451 391/500 [======================>.......] - ETA: 42s - loss: 0.5352 - regression_loss: 0.4901 - classification_loss: 0.0450 392/500 [======================>.......] - ETA: 42s - loss: 0.5345 - regression_loss: 0.4896 - classification_loss: 0.0450 393/500 [======================>.......] 
- ETA: 41s - loss: 0.5340 - regression_loss: 0.4891 - classification_loss: 0.0449 394/500 [======================>.......] - ETA: 41s - loss: 0.5341 - regression_loss: 0.4892 - classification_loss: 0.0449 395/500 [======================>.......] - ETA: 40s - loss: 0.5333 - regression_loss: 0.4884 - classification_loss: 0.0449 396/500 [======================>.......] - ETA: 40s - loss: 0.5328 - regression_loss: 0.4880 - classification_loss: 0.0448 397/500 [======================>.......] - ETA: 40s - loss: 0.5326 - regression_loss: 0.4878 - classification_loss: 0.0448 398/500 [======================>.......] - ETA: 39s - loss: 0.5323 - regression_loss: 0.4875 - classification_loss: 0.0447 399/500 [======================>.......] - ETA: 39s - loss: 0.5325 - regression_loss: 0.4878 - classification_loss: 0.0447 400/500 [=======================>......] - ETA: 38s - loss: 0.5333 - regression_loss: 0.4885 - classification_loss: 0.0448 401/500 [=======================>......] - ETA: 38s - loss: 0.5331 - regression_loss: 0.4883 - classification_loss: 0.0448 402/500 [=======================>......] - ETA: 38s - loss: 0.5329 - regression_loss: 0.4881 - classification_loss: 0.0448 403/500 [=======================>......] - ETA: 37s - loss: 0.5339 - regression_loss: 0.4889 - classification_loss: 0.0449 404/500 [=======================>......] - ETA: 37s - loss: 0.5336 - regression_loss: 0.4887 - classification_loss: 0.0449 405/500 [=======================>......] - ETA: 36s - loss: 0.5332 - regression_loss: 0.4884 - classification_loss: 0.0448 406/500 [=======================>......] - ETA: 36s - loss: 0.5327 - regression_loss: 0.4879 - classification_loss: 0.0448 407/500 [=======================>......] - ETA: 36s - loss: 0.5327 - regression_loss: 0.4879 - classification_loss: 0.0448 408/500 [=======================>......] - ETA: 35s - loss: 0.5344 - regression_loss: 0.4891 - classification_loss: 0.0453 409/500 [=======================>......] 
- ETA: 35s - loss: 0.5334 - regression_loss: 0.4883 - classification_loss: 0.0452 410/500 [=======================>......] - ETA: 35s - loss: 0.5330 - regression_loss: 0.4879 - classification_loss: 0.0451 411/500 [=======================>......] - ETA: 34s - loss: 0.5330 - regression_loss: 0.4879 - classification_loss: 0.0451 412/500 [=======================>......] - ETA: 34s - loss: 0.5343 - regression_loss: 0.4890 - classification_loss: 0.0452 413/500 [=======================>......] - ETA: 33s - loss: 0.5346 - regression_loss: 0.4894 - classification_loss: 0.0452 414/500 [=======================>......] - ETA: 33s - loss: 0.5340 - regression_loss: 0.4889 - classification_loss: 0.0451 415/500 [=======================>......] - ETA: 33s - loss: 0.5340 - regression_loss: 0.4889 - classification_loss: 0.0451 416/500 [=======================>......] - ETA: 32s - loss: 0.5346 - regression_loss: 0.4895 - classification_loss: 0.0452 417/500 [========================>.....] - ETA: 32s - loss: 0.5344 - regression_loss: 0.4893 - classification_loss: 0.0451 418/500 [========================>.....] - ETA: 31s - loss: 0.5349 - regression_loss: 0.4897 - classification_loss: 0.0452 419/500 [========================>.....] - ETA: 31s - loss: 0.5350 - regression_loss: 0.4898 - classification_loss: 0.0451 420/500 [========================>.....] - ETA: 31s - loss: 0.5343 - regression_loss: 0.4892 - classification_loss: 0.0451 421/500 [========================>.....] - ETA: 30s - loss: 0.5341 - regression_loss: 0.4890 - classification_loss: 0.0450 422/500 [========================>.....] - ETA: 30s - loss: 0.5346 - regression_loss: 0.4896 - classification_loss: 0.0451 423/500 [========================>.....] - ETA: 29s - loss: 0.5347 - regression_loss: 0.4896 - classification_loss: 0.0450 424/500 [========================>.....] - ETA: 29s - loss: 0.5344 - regression_loss: 0.4893 - classification_loss: 0.0450 425/500 [========================>.....] 
- ETA: 29s - loss: 0.5341 - regression_loss: 0.4891 - classification_loss: 0.0450 426/500 [========================>.....] - ETA: 28s - loss: 0.5336 - regression_loss: 0.4887 - classification_loss: 0.0449 427/500 [========================>.....] - ETA: 28s - loss: 0.5337 - regression_loss: 0.4888 - classification_loss: 0.0449 428/500 [========================>.....] - ETA: 28s - loss: 0.5340 - regression_loss: 0.4891 - classification_loss: 0.0450 429/500 [========================>.....] - ETA: 27s - loss: 0.5349 - regression_loss: 0.4897 - classification_loss: 0.0452 430/500 [========================>.....] - ETA: 27s - loss: 0.5350 - regression_loss: 0.4898 - classification_loss: 0.0452 431/500 [========================>.....] - ETA: 26s - loss: 0.5347 - regression_loss: 0.4895 - classification_loss: 0.0451 432/500 [========================>.....] - ETA: 26s - loss: 0.5362 - regression_loss: 0.4910 - classification_loss: 0.0452 433/500 [========================>.....] - ETA: 26s - loss: 0.5362 - regression_loss: 0.4910 - classification_loss: 0.0452 434/500 [=========================>....] - ETA: 25s - loss: 0.5358 - regression_loss: 0.4906 - classification_loss: 0.0452 435/500 [=========================>....] - ETA: 25s - loss: 0.5359 - regression_loss: 0.4907 - classification_loss: 0.0452 436/500 [=========================>....] - ETA: 24s - loss: 0.5355 - regression_loss: 0.4903 - classification_loss: 0.0452 437/500 [=========================>....] - ETA: 24s - loss: 0.5352 - regression_loss: 0.4900 - classification_loss: 0.0452 438/500 [=========================>....] - ETA: 24s - loss: 0.5352 - regression_loss: 0.4900 - classification_loss: 0.0452 439/500 [=========================>....] - ETA: 23s - loss: 0.5353 - regression_loss: 0.4901 - classification_loss: 0.0451 440/500 [=========================>....] - ETA: 23s - loss: 0.5355 - regression_loss: 0.4904 - classification_loss: 0.0451 441/500 [=========================>....] 
- ETA: 22s - loss: 0.5351 - regression_loss: 0.4900 - classification_loss: 0.0451 442/500 [=========================>....] - ETA: 22s - loss: 0.5347 - regression_loss: 0.4896 - classification_loss: 0.0451 443/500 [=========================>....] - ETA: 22s - loss: 0.5344 - regression_loss: 0.4894 - classification_loss: 0.0450 444/500 [=========================>....] - ETA: 21s - loss: 0.5357 - regression_loss: 0.4906 - classification_loss: 0.0451 445/500 [=========================>....] - ETA: 21s - loss: 0.5353 - regression_loss: 0.4902 - classification_loss: 0.0450 446/500 [=========================>....] - ETA: 21s - loss: 0.5354 - regression_loss: 0.4903 - classification_loss: 0.0450 447/500 [=========================>....] - ETA: 20s - loss: 0.5350 - regression_loss: 0.4900 - classification_loss: 0.0450 448/500 [=========================>....] - ETA: 20s - loss: 0.5350 - regression_loss: 0.4900 - classification_loss: 0.0450 449/500 [=========================>....] - ETA: 19s - loss: 0.5353 - regression_loss: 0.4904 - classification_loss: 0.0450 450/500 [==========================>...] - ETA: 19s - loss: 0.5361 - regression_loss: 0.4910 - classification_loss: 0.0452 451/500 [==========================>...] - ETA: 19s - loss: 0.5363 - regression_loss: 0.4911 - classification_loss: 0.0452 452/500 [==========================>...] - ETA: 18s - loss: 0.5363 - regression_loss: 0.4911 - classification_loss: 0.0452 453/500 [==========================>...] - ETA: 18s - loss: 0.5370 - regression_loss: 0.4917 - classification_loss: 0.0453 454/500 [==========================>...] - ETA: 17s - loss: 0.5372 - regression_loss: 0.4920 - classification_loss: 0.0453 455/500 [==========================>...] - ETA: 17s - loss: 0.5377 - regression_loss: 0.4924 - classification_loss: 0.0453 456/500 [==========================>...] - ETA: 17s - loss: 0.5378 - regression_loss: 0.4925 - classification_loss: 0.0453 457/500 [==========================>...] 
- ETA: 16s - loss: 0.5376 - regression_loss: 0.4923 - classification_loss: 0.0453 458/500 [==========================>...] - ETA: 16s - loss: 0.5377 - regression_loss: 0.4925 - classification_loss: 0.0453 459/500 [==========================>...] - ETA: 15s - loss: 0.5376 - regression_loss: 0.4923 - classification_loss: 0.0452 460/500 [==========================>...] - ETA: 15s - loss: 0.5378 - regression_loss: 0.4926 - classification_loss: 0.0452 461/500 [==========================>...] - ETA: 15s - loss: 0.5379 - regression_loss: 0.4928 - classification_loss: 0.0451 462/500 [==========================>...] - ETA: 14s - loss: 0.5379 - regression_loss: 0.4927 - classification_loss: 0.0451 463/500 [==========================>...] - ETA: 14s - loss: 0.5377 - regression_loss: 0.4926 - classification_loss: 0.0451 464/500 [==========================>...] - ETA: 13s - loss: 0.5374 - regression_loss: 0.4924 - classification_loss: 0.0450 465/500 [==========================>...] - ETA: 13s - loss: 0.5370 - regression_loss: 0.4920 - classification_loss: 0.0450 466/500 [==========================>...] - ETA: 13s - loss: 0.5369 - regression_loss: 0.4919 - classification_loss: 0.0450 467/500 [===========================>..] - ETA: 12s - loss: 0.5369 - regression_loss: 0.4919 - classification_loss: 0.0449 468/500 [===========================>..] - ETA: 12s - loss: 0.5367 - regression_loss: 0.4918 - classification_loss: 0.0449 469/500 [===========================>..] - ETA: 12s - loss: 0.5365 - regression_loss: 0.4916 - classification_loss: 0.0448 470/500 [===========================>..] - ETA: 11s - loss: 0.5363 - regression_loss: 0.4915 - classification_loss: 0.0448 471/500 [===========================>..] - ETA: 11s - loss: 0.5361 - regression_loss: 0.4913 - classification_loss: 0.0448 472/500 [===========================>..] - ETA: 10s - loss: 0.5358 - regression_loss: 0.4911 - classification_loss: 0.0447 473/500 [===========================>..] 
- ETA: 10s - loss: 0.5351 - regression_loss: 0.4904 - classification_loss: 0.0447 474/500 [===========================>..] - ETA: 10s - loss: 0.5349 - regression_loss: 0.4902 - classification_loss: 0.0446 475/500 [===========================>..] - ETA: 9s - loss: 0.5355 - regression_loss: 0.4908 - classification_loss: 0.0448  476/500 [===========================>..] - ETA: 9s - loss: 0.5353 - regression_loss: 0.4905 - classification_loss: 0.0447 477/500 [===========================>..] - ETA: 8s - loss: 0.5351 - regression_loss: 0.4904 - classification_loss: 0.0447 478/500 [===========================>..] - ETA: 8s - loss: 0.5347 - regression_loss: 0.4900 - classification_loss: 0.0447 479/500 [===========================>..] - ETA: 8s - loss: 0.5346 - regression_loss: 0.4900 - classification_loss: 0.0446 480/500 [===========================>..] - ETA: 7s - loss: 0.5345 - regression_loss: 0.4899 - classification_loss: 0.0446 481/500 [===========================>..] - ETA: 7s - loss: 0.5342 - regression_loss: 0.4896 - classification_loss: 0.0446 482/500 [===========================>..] - ETA: 7s - loss: 0.5340 - regression_loss: 0.4894 - classification_loss: 0.0446 483/500 [===========================>..] - ETA: 6s - loss: 0.5337 - regression_loss: 0.4891 - classification_loss: 0.0446 484/500 [============================>.] - ETA: 6s - loss: 0.5341 - regression_loss: 0.4894 - classification_loss: 0.0447 485/500 [============================>.] - ETA: 5s - loss: 0.5342 - regression_loss: 0.4895 - classification_loss: 0.0447 486/500 [============================>.] - ETA: 5s - loss: 0.5339 - regression_loss: 0.4892 - classification_loss: 0.0447 487/500 [============================>.] - ETA: 5s - loss: 0.5340 - regression_loss: 0.4894 - classification_loss: 0.0447 488/500 [============================>.] - ETA: 4s - loss: 0.5339 - regression_loss: 0.4892 - classification_loss: 0.0447 489/500 [============================>.] 
- ETA: 4s - loss: 0.5346 - regression_loss: 0.4899 - classification_loss: 0.0447 490/500 [============================>.] - ETA: 3s - loss: 0.5342 - regression_loss: 0.4896 - classification_loss: 0.0446 491/500 [============================>.] - ETA: 3s - loss: 0.5346 - regression_loss: 0.4898 - classification_loss: 0.0447 492/500 [============================>.] - ETA: 3s - loss: 0.5343 - regression_loss: 0.4895 - classification_loss: 0.0447 493/500 [============================>.] - ETA: 2s - loss: 0.5342 - regression_loss: 0.4895 - classification_loss: 0.0447 494/500 [============================>.] - ETA: 2s - loss: 0.5344 - regression_loss: 0.4897 - classification_loss: 0.0447 495/500 [============================>.] - ETA: 1s - loss: 0.5338 - regression_loss: 0.4892 - classification_loss: 0.0447 496/500 [============================>.] - ETA: 1s - loss: 0.5336 - regression_loss: 0.4890 - classification_loss: 0.0446 497/500 [============================>.] - ETA: 1s - loss: 0.5331 - regression_loss: 0.4885 - classification_loss: 0.0446 498/500 [============================>.] - ETA: 0s - loss: 0.5326 - regression_loss: 0.4881 - classification_loss: 0.0446 499/500 [============================>.] - ETA: 0s - loss: 0.5337 - regression_loss: 0.4890 - classification_loss: 0.0447 500/500 [==============================] - 195s 389ms/step - loss: 0.5341 - regression_loss: 0.4894 - classification_loss: 0.0448 326 instances of class plum with average precision: 0.8220 mAP: 0.8220 Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5 Epoch 18/20 1/500 [..............................] - ETA: 3:08 - loss: 0.6596 - regression_loss: 0.6030 - classification_loss: 0.0567 2/500 [..............................] - ETA: 3:07 - loss: 0.6382 - regression_loss: 0.5999 - classification_loss: 0.0383 3/500 [..............................] - ETA: 3:10 - loss: 0.6684 - regression_loss: 0.6347 - classification_loss: 0.0336 4/500 [..............................] 
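The end-of-epoch summary lines above (final losses and mAP) can be pulled out of a captured log programmatically. Below is a minimal sketch; the regexes are assumptions about the exact progress-bar formatting Keras emits, and `parse_epoch_summary` is a hypothetical helper, not part of keras-retinanet.

```python
import re

# Assumed format of a Keras end-of-epoch line, e.g.:
# "500/500 [====] - 195s 389ms/step - loss: 0.5341 - regression_loss: ..."
EPOCH_END = re.compile(
    r"\d+/\d+ \[=+\] - \d+s \d+ms/step"
    r" - loss: (?P<loss>[\d.]+)"
    r" - regression_loss: (?P<reg>[\d.]+)"
    r" - classification_loss: (?P<cls>[\d.]+)"
)
# keras-retinanet's Pascal evaluation prints "mAP: <value>" after each epoch.
MAP_LINE = re.compile(r"mAP: (?P<map>[\d.]+)")

def parse_epoch_summary(log_text):
    """Return the final losses and mAP found in one epoch's log text."""
    out = {}
    m = EPOCH_END.search(log_text)
    if m:
        out["loss"] = float(m["loss"])
        out["regression_loss"] = float(m["reg"])
        out["classification_loss"] = float(m["cls"])
    m = MAP_LINE.search(log_text)
    if m:
        out["mAP"] = float(m["map"])
    return out

sample = (
    "500/500 [==============================] - 195s 389ms/step "
    "- loss: 0.5341 - regression_loss: 0.4894 - classification_loss: 0.0448\n"
    "326 instances of class plum with average precision: 0.8220\n"
    "mAP: 0.8220\n"
)
print(parse_epoch_summary(sample))
```

Collecting these per-epoch dicts across the whole log gives the loss and mAP curves without rerunning training.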
- ETA: 2:55 - loss: 0.5677 - regression_loss: 0.5191 - classification_loss: 0.0486 53/500 [==>...........................] - ETA: 2:55 - loss: 0.5647 - regression_loss: 0.5166 - classification_loss: 0.0481 54/500 [==>...........................] - ETA: 2:54 - loss: 0.5633 - regression_loss: 0.5154 - classification_loss: 0.0478 55/500 [==>...........................] - ETA: 2:54 - loss: 0.5556 - regression_loss: 0.5083 - classification_loss: 0.0473 56/500 [==>...........................] - ETA: 2:54 - loss: 0.5525 - regression_loss: 0.5057 - classification_loss: 0.0468 57/500 [==>...........................] - ETA: 2:53 - loss: 0.5527 - regression_loss: 0.5059 - classification_loss: 0.0468 58/500 [==>...........................] - ETA: 2:53 - loss: 0.5540 - regression_loss: 0.5072 - classification_loss: 0.0468 59/500 [==>...........................] - ETA: 2:52 - loss: 0.5493 - regression_loss: 0.5029 - classification_loss: 0.0464 60/500 [==>...........................] - ETA: 2:52 - loss: 0.5514 - regression_loss: 0.5049 - classification_loss: 0.0465 61/500 [==>...........................] - ETA: 2:52 - loss: 0.5500 - regression_loss: 0.5038 - classification_loss: 0.0461 62/500 [==>...........................] - ETA: 2:51 - loss: 0.5491 - regression_loss: 0.5028 - classification_loss: 0.0463 63/500 [==>...........................] - ETA: 2:51 - loss: 0.5465 - regression_loss: 0.5006 - classification_loss: 0.0459 64/500 [==>...........................] - ETA: 2:50 - loss: 0.5440 - regression_loss: 0.4984 - classification_loss: 0.0456 65/500 [==>...........................] - ETA: 2:50 - loss: 0.5454 - regression_loss: 0.4998 - classification_loss: 0.0456 66/500 [==>...........................] - ETA: 2:50 - loss: 0.5468 - regression_loss: 0.5012 - classification_loss: 0.0456 67/500 [===>..........................] - ETA: 2:49 - loss: 0.5433 - regression_loss: 0.4981 - classification_loss: 0.0452 68/500 [===>..........................] 
- ETA: 2:49 - loss: 0.5455 - regression_loss: 0.5003 - classification_loss: 0.0453 69/500 [===>..........................] - ETA: 2:49 - loss: 0.5436 - regression_loss: 0.4987 - classification_loss: 0.0449 70/500 [===>..........................] - ETA: 2:48 - loss: 0.5396 - regression_loss: 0.4951 - classification_loss: 0.0446 71/500 [===>..........................] - ETA: 2:48 - loss: 0.5387 - regression_loss: 0.4943 - classification_loss: 0.0445 72/500 [===>..........................] - ETA: 2:48 - loss: 0.5383 - regression_loss: 0.4938 - classification_loss: 0.0444 73/500 [===>..........................] - ETA: 2:47 - loss: 0.5372 - regression_loss: 0.4929 - classification_loss: 0.0443 74/500 [===>..........................] - ETA: 2:47 - loss: 0.5333 - regression_loss: 0.4894 - classification_loss: 0.0439 75/500 [===>..........................] - ETA: 2:46 - loss: 0.5310 - regression_loss: 0.4871 - classification_loss: 0.0439 76/500 [===>..........................] - ETA: 2:46 - loss: 0.5286 - regression_loss: 0.4850 - classification_loss: 0.0435 77/500 [===>..........................] - ETA: 2:46 - loss: 0.5250 - regression_loss: 0.4816 - classification_loss: 0.0434 78/500 [===>..........................] - ETA: 2:45 - loss: 0.5273 - regression_loss: 0.4836 - classification_loss: 0.0437 79/500 [===>..........................] - ETA: 2:45 - loss: 0.5248 - regression_loss: 0.4814 - classification_loss: 0.0433 80/500 [===>..........................] - ETA: 2:45 - loss: 0.5216 - regression_loss: 0.4786 - classification_loss: 0.0430 81/500 [===>..........................] - ETA: 2:44 - loss: 0.5184 - regression_loss: 0.4758 - classification_loss: 0.0426 82/500 [===>..........................] - ETA: 2:44 - loss: 0.5161 - regression_loss: 0.4736 - classification_loss: 0.0425 83/500 [===>..........................] - ETA: 2:43 - loss: 0.5148 - regression_loss: 0.4725 - classification_loss: 0.0423 84/500 [====>.........................] 
- ETA: 2:43 - loss: 0.5134 - regression_loss: 0.4714 - classification_loss: 0.0421 85/500 [====>.........................] - ETA: 2:42 - loss: 0.5123 - regression_loss: 0.4704 - classification_loss: 0.0419 86/500 [====>.........................] - ETA: 2:42 - loss: 0.5129 - regression_loss: 0.4710 - classification_loss: 0.0419 87/500 [====>.........................] - ETA: 2:42 - loss: 0.5152 - regression_loss: 0.4733 - classification_loss: 0.0419 88/500 [====>.........................] - ETA: 2:41 - loss: 0.5132 - regression_loss: 0.4714 - classification_loss: 0.0417 89/500 [====>.........................] - ETA: 2:41 - loss: 0.5114 - regression_loss: 0.4697 - classification_loss: 0.0417 90/500 [====>.........................] - ETA: 2:40 - loss: 0.5095 - regression_loss: 0.4681 - classification_loss: 0.0414 91/500 [====>.........................] - ETA: 2:40 - loss: 0.5084 - regression_loss: 0.4670 - classification_loss: 0.0414 92/500 [====>.........................] - ETA: 2:40 - loss: 0.5064 - regression_loss: 0.4651 - classification_loss: 0.0413 93/500 [====>.........................] - ETA: 2:39 - loss: 0.5022 - regression_loss: 0.4612 - classification_loss: 0.0410 94/500 [====>.........................] - ETA: 2:39 - loss: 0.5026 - regression_loss: 0.4616 - classification_loss: 0.0410 95/500 [====>.........................] - ETA: 2:38 - loss: 0.5023 - regression_loss: 0.4614 - classification_loss: 0.0409 96/500 [====>.........................] - ETA: 2:38 - loss: 0.5090 - regression_loss: 0.4670 - classification_loss: 0.0420 97/500 [====>.........................] - ETA: 2:37 - loss: 0.5059 - regression_loss: 0.4642 - classification_loss: 0.0417 98/500 [====>.........................] - ETA: 2:37 - loss: 0.5051 - regression_loss: 0.4635 - classification_loss: 0.0416 99/500 [====>.........................] - ETA: 2:37 - loss: 0.5037 - regression_loss: 0.4624 - classification_loss: 0.0413 100/500 [=====>........................] 
- ETA: 2:36 - loss: 0.5028 - regression_loss: 0.4617 - classification_loss: 0.0411 101/500 [=====>........................] - ETA: 2:36 - loss: 0.5012 - regression_loss: 0.4603 - classification_loss: 0.0409 102/500 [=====>........................] - ETA: 2:35 - loss: 0.4990 - regression_loss: 0.4581 - classification_loss: 0.0410 103/500 [=====>........................] - ETA: 2:35 - loss: 0.4974 - regression_loss: 0.4566 - classification_loss: 0.0408 104/500 [=====>........................] - ETA: 2:35 - loss: 0.4975 - regression_loss: 0.4567 - classification_loss: 0.0407 105/500 [=====>........................] - ETA: 2:34 - loss: 0.4978 - regression_loss: 0.4571 - classification_loss: 0.0407 106/500 [=====>........................] - ETA: 2:34 - loss: 0.4954 - regression_loss: 0.4549 - classification_loss: 0.0405 107/500 [=====>........................] - ETA: 2:33 - loss: 0.4935 - regression_loss: 0.4532 - classification_loss: 0.0403 108/500 [=====>........................] - ETA: 2:33 - loss: 0.4910 - regression_loss: 0.4509 - classification_loss: 0.0401 109/500 [=====>........................] - ETA: 2:33 - loss: 0.4942 - regression_loss: 0.4539 - classification_loss: 0.0404 110/500 [=====>........................] - ETA: 2:32 - loss: 0.4922 - regression_loss: 0.4520 - classification_loss: 0.0402 111/500 [=====>........................] - ETA: 2:32 - loss: 0.4939 - regression_loss: 0.4535 - classification_loss: 0.0404 112/500 [=====>........................] - ETA: 2:32 - loss: 0.4956 - regression_loss: 0.4553 - classification_loss: 0.0403 113/500 [=====>........................] - ETA: 2:31 - loss: 0.4956 - regression_loss: 0.4555 - classification_loss: 0.0402 114/500 [=====>........................] - ETA: 2:31 - loss: 0.4948 - regression_loss: 0.4548 - classification_loss: 0.0400 115/500 [=====>........................] - ETA: 2:30 - loss: 0.4954 - regression_loss: 0.4553 - classification_loss: 0.0401 116/500 [=====>........................] 
- ETA: 2:30 - loss: 0.4990 - regression_loss: 0.4585 - classification_loss: 0.0404 117/500 [======>.......................] - ETA: 2:29 - loss: 0.4978 - regression_loss: 0.4575 - classification_loss: 0.0403 118/500 [======>.......................] - ETA: 2:29 - loss: 0.4982 - regression_loss: 0.4579 - classification_loss: 0.0403 119/500 [======>.......................] - ETA: 2:29 - loss: 0.4979 - regression_loss: 0.4576 - classification_loss: 0.0403 120/500 [======>.......................] - ETA: 2:28 - loss: 0.4950 - regression_loss: 0.4549 - classification_loss: 0.0401 121/500 [======>.......................] - ETA: 2:28 - loss: 0.4949 - regression_loss: 0.4548 - classification_loss: 0.0401 122/500 [======>.......................] - ETA: 2:27 - loss: 0.4995 - regression_loss: 0.4578 - classification_loss: 0.0417 123/500 [======>.......................] - ETA: 2:27 - loss: 0.4994 - regression_loss: 0.4578 - classification_loss: 0.0416 124/500 [======>.......................] - ETA: 2:26 - loss: 0.4972 - regression_loss: 0.4558 - classification_loss: 0.0414 125/500 [======>.......................] - ETA: 2:26 - loss: 0.4981 - regression_loss: 0.4566 - classification_loss: 0.0415 126/500 [======>.......................] - ETA: 2:26 - loss: 0.4967 - regression_loss: 0.4554 - classification_loss: 0.0413 127/500 [======>.......................] - ETA: 2:25 - loss: 0.4968 - regression_loss: 0.4557 - classification_loss: 0.0412 128/500 [======>.......................] - ETA: 2:25 - loss: 0.4980 - regression_loss: 0.4568 - classification_loss: 0.0412 129/500 [======>.......................] - ETA: 2:25 - loss: 0.4974 - regression_loss: 0.4563 - classification_loss: 0.0411 130/500 [======>.......................] - ETA: 2:24 - loss: 0.4961 - regression_loss: 0.4551 - classification_loss: 0.0410 131/500 [======>.......................] - ETA: 2:24 - loss: 0.4949 - regression_loss: 0.4542 - classification_loss: 0.0408 132/500 [======>.......................] 
- ETA: 2:23 - loss: 0.4942 - regression_loss: 0.4536 - classification_loss: 0.0407 133/500 [======>.......................] - ETA: 2:23 - loss: 0.4937 - regression_loss: 0.4531 - classification_loss: 0.0406 134/500 [=======>......................] - ETA: 2:22 - loss: 0.4942 - regression_loss: 0.4536 - classification_loss: 0.0406 135/500 [=======>......................] - ETA: 2:22 - loss: 0.4936 - regression_loss: 0.4532 - classification_loss: 0.0404 136/500 [=======>......................] - ETA: 2:22 - loss: 0.4929 - regression_loss: 0.4525 - classification_loss: 0.0404 137/500 [=======>......................] - ETA: 2:21 - loss: 0.4926 - regression_loss: 0.4521 - classification_loss: 0.0405 138/500 [=======>......................] - ETA: 2:21 - loss: 0.4912 - regression_loss: 0.4509 - classification_loss: 0.0403 139/500 [=======>......................] - ETA: 2:21 - loss: 0.4910 - regression_loss: 0.4507 - classification_loss: 0.0403 140/500 [=======>......................] - ETA: 2:20 - loss: 0.4930 - regression_loss: 0.4527 - classification_loss: 0.0403 141/500 [=======>......................] - ETA: 2:20 - loss: 0.4928 - regression_loss: 0.4525 - classification_loss: 0.0403 142/500 [=======>......................] - ETA: 2:19 - loss: 0.4920 - regression_loss: 0.4517 - classification_loss: 0.0403 143/500 [=======>......................] - ETA: 2:19 - loss: 0.4904 - regression_loss: 0.4503 - classification_loss: 0.0401 144/500 [=======>......................] - ETA: 2:19 - loss: 0.4885 - regression_loss: 0.4486 - classification_loss: 0.0399 145/500 [=======>......................] - ETA: 2:18 - loss: 0.4879 - regression_loss: 0.4481 - classification_loss: 0.0398 146/500 [=======>......................] - ETA: 2:18 - loss: 0.4876 - regression_loss: 0.4479 - classification_loss: 0.0397 147/500 [=======>......................] - ETA: 2:17 - loss: 0.4881 - regression_loss: 0.4485 - classification_loss: 0.0396 148/500 [=======>......................] 
- ETA: 2:17 - loss: 0.4876 - regression_loss: 0.4481 - classification_loss: 0.0395 149/500 [=======>......................] - ETA: 2:17 - loss: 0.4866 - regression_loss: 0.4471 - classification_loss: 0.0394 150/500 [========>.....................] - ETA: 2:16 - loss: 0.4867 - regression_loss: 0.4473 - classification_loss: 0.0393 151/500 [========>.....................] - ETA: 2:16 - loss: 0.4871 - regression_loss: 0.4478 - classification_loss: 0.0393 152/500 [========>.....................] - ETA: 2:16 - loss: 0.4860 - regression_loss: 0.4469 - classification_loss: 0.0391 153/500 [========>.....................] - ETA: 2:15 - loss: 0.4858 - regression_loss: 0.4467 - classification_loss: 0.0391 154/500 [========>.....................] - ETA: 2:15 - loss: 0.4870 - regression_loss: 0.4478 - classification_loss: 0.0392 155/500 [========>.....................] - ETA: 2:14 - loss: 0.4880 - regression_loss: 0.4490 - classification_loss: 0.0391 156/500 [========>.....................] - ETA: 2:14 - loss: 0.4905 - regression_loss: 0.4513 - classification_loss: 0.0392 157/500 [========>.....................] - ETA: 2:14 - loss: 0.4925 - regression_loss: 0.4533 - classification_loss: 0.0392 158/500 [========>.....................] - ETA: 2:13 - loss: 0.4958 - regression_loss: 0.4564 - classification_loss: 0.0394 159/500 [========>.....................] - ETA: 2:13 - loss: 0.4971 - regression_loss: 0.4575 - classification_loss: 0.0396 160/500 [========>.....................] - ETA: 2:12 - loss: 0.4972 - regression_loss: 0.4577 - classification_loss: 0.0395 161/500 [========>.....................] - ETA: 2:12 - loss: 0.4970 - regression_loss: 0.4576 - classification_loss: 0.0394 162/500 [========>.....................] - ETA: 2:11 - loss: 0.4996 - regression_loss: 0.4601 - classification_loss: 0.0395 163/500 [========>.....................] - ETA: 2:11 - loss: 0.4998 - regression_loss: 0.4603 - classification_loss: 0.0396 164/500 [========>.....................] 
- ETA: 2:11 - loss: 0.4992 - regression_loss: 0.4596 - classification_loss: 0.0395 165/500 [========>.....................] - ETA: 2:10 - loss: 0.5002 - regression_loss: 0.4607 - classification_loss: 0.0395 166/500 [========>.....................] - ETA: 2:10 - loss: 0.4993 - regression_loss: 0.4599 - classification_loss: 0.0394 167/500 [=========>....................] - ETA: 2:09 - loss: 0.4976 - regression_loss: 0.4583 - classification_loss: 0.0392 168/500 [=========>....................] - ETA: 2:09 - loss: 0.4980 - regression_loss: 0.4586 - classification_loss: 0.0394 169/500 [=========>....................] - ETA: 2:09 - loss: 0.4973 - regression_loss: 0.4579 - classification_loss: 0.0393 170/500 [=========>....................] - ETA: 2:08 - loss: 0.4984 - regression_loss: 0.4586 - classification_loss: 0.0398 171/500 [=========>....................] - ETA: 2:08 - loss: 0.4974 - regression_loss: 0.4577 - classification_loss: 0.0397 172/500 [=========>....................] - ETA: 2:08 - loss: 0.4964 - regression_loss: 0.4568 - classification_loss: 0.0396 173/500 [=========>....................] - ETA: 2:07 - loss: 0.4957 - regression_loss: 0.4563 - classification_loss: 0.0395 174/500 [=========>....................] - ETA: 2:07 - loss: 0.4982 - regression_loss: 0.4586 - classification_loss: 0.0395 175/500 [=========>....................] - ETA: 2:06 - loss: 0.4989 - regression_loss: 0.4593 - classification_loss: 0.0396 176/500 [=========>....................] - ETA: 2:06 - loss: 0.4998 - regression_loss: 0.4600 - classification_loss: 0.0398 177/500 [=========>....................] - ETA: 2:05 - loss: 0.5000 - regression_loss: 0.4602 - classification_loss: 0.0398 178/500 [=========>....................] - ETA: 2:05 - loss: 0.4994 - regression_loss: 0.4597 - classification_loss: 0.0397 179/500 [=========>....................] - ETA: 2:05 - loss: 0.4988 - regression_loss: 0.4591 - classification_loss: 0.0397 180/500 [=========>....................] 
- ETA: 2:04 - loss: 0.4984 - regression_loss: 0.4587 - classification_loss: 0.0397 181/500 [=========>....................] - ETA: 2:04 - loss: 0.4981 - regression_loss: 0.4585 - classification_loss: 0.0396 182/500 [=========>....................] - ETA: 2:04 - loss: 0.5001 - regression_loss: 0.4602 - classification_loss: 0.0399 183/500 [=========>....................] - ETA: 2:03 - loss: 0.5006 - regression_loss: 0.4606 - classification_loss: 0.0400 184/500 [==========>...................] - ETA: 2:03 - loss: 0.5011 - regression_loss: 0.4611 - classification_loss: 0.0400 185/500 [==========>...................] - ETA: 2:02 - loss: 0.5009 - regression_loss: 0.4610 - classification_loss: 0.0399 186/500 [==========>...................] - ETA: 2:02 - loss: 0.5002 - regression_loss: 0.4604 - classification_loss: 0.0398 187/500 [==========>...................] - ETA: 2:01 - loss: 0.5056 - regression_loss: 0.4649 - classification_loss: 0.0407 188/500 [==========>...................] - ETA: 2:01 - loss: 0.5052 - regression_loss: 0.4644 - classification_loss: 0.0407 189/500 [==========>...................] - ETA: 2:01 - loss: 0.5052 - regression_loss: 0.4645 - classification_loss: 0.0407 190/500 [==========>...................] - ETA: 2:00 - loss: 0.5062 - regression_loss: 0.4656 - classification_loss: 0.0406 191/500 [==========>...................] - ETA: 2:00 - loss: 0.5069 - regression_loss: 0.4662 - classification_loss: 0.0407 192/500 [==========>...................] - ETA: 2:00 - loss: 0.5065 - regression_loss: 0.4658 - classification_loss: 0.0407 193/500 [==========>...................] - ETA: 1:59 - loss: 0.5054 - regression_loss: 0.4647 - classification_loss: 0.0406 194/500 [==========>...................] - ETA: 1:59 - loss: 0.5048 - regression_loss: 0.4643 - classification_loss: 0.0406 195/500 [==========>...................] - ETA: 1:58 - loss: 0.5048 - regression_loss: 0.4642 - classification_loss: 0.0406 196/500 [==========>...................] 
- ETA: 1:58 - loss: 0.5051 - regression_loss: 0.4645 - classification_loss: 0.0406 197/500 [==========>...................] - ETA: 1:58 - loss: 0.5054 - regression_loss: 0.4648 - classification_loss: 0.0407 198/500 [==========>...................] - ETA: 1:57 - loss: 0.5072 - regression_loss: 0.4665 - classification_loss: 0.0407 199/500 [==========>...................] - ETA: 1:57 - loss: 0.5063 - regression_loss: 0.4657 - classification_loss: 0.0406 200/500 [===========>..................] - ETA: 1:56 - loss: 0.5059 - regression_loss: 0.4653 - classification_loss: 0.0405 201/500 [===========>..................] - ETA: 1:56 - loss: 0.5053 - regression_loss: 0.4649 - classification_loss: 0.0405 202/500 [===========>..................] - ETA: 1:56 - loss: 0.5043 - regression_loss: 0.4639 - classification_loss: 0.0404 203/500 [===========>..................] - ETA: 1:55 - loss: 0.5033 - regression_loss: 0.4630 - classification_loss: 0.0404 204/500 [===========>..................] - ETA: 1:55 - loss: 0.5031 - regression_loss: 0.4629 - classification_loss: 0.0402 205/500 [===========>..................] - ETA: 1:55 - loss: 0.5021 - regression_loss: 0.4620 - classification_loss: 0.0401 206/500 [===========>..................] - ETA: 1:54 - loss: 0.5016 - regression_loss: 0.4616 - classification_loss: 0.0400 207/500 [===========>..................] - ETA: 1:54 - loss: 0.5022 - regression_loss: 0.4621 - classification_loss: 0.0400 208/500 [===========>..................] - ETA: 1:53 - loss: 0.5041 - regression_loss: 0.4639 - classification_loss: 0.0401 209/500 [===========>..................] - ETA: 1:53 - loss: 0.5059 - regression_loss: 0.4655 - classification_loss: 0.0404 210/500 [===========>..................] - ETA: 1:53 - loss: 0.5059 - regression_loss: 0.4656 - classification_loss: 0.0403 211/500 [===========>..................] - ETA: 1:52 - loss: 0.5081 - regression_loss: 0.4673 - classification_loss: 0.0408 212/500 [===========>..................] 
- ETA: 1:52 - loss: 0.5070 - regression_loss: 0.4663 - classification_loss: 0.0407 213/500 [===========>..................] - ETA: 1:51 - loss: 0.5086 - regression_loss: 0.4678 - classification_loss: 0.0408 214/500 [===========>..................] - ETA: 1:51 - loss: 0.5078 - regression_loss: 0.4671 - classification_loss: 0.0407 215/500 [===========>..................] - ETA: 1:51 - loss: 0.5079 - regression_loss: 0.4672 - classification_loss: 0.0407 216/500 [===========>..................] - ETA: 1:50 - loss: 0.5070 - regression_loss: 0.4663 - classification_loss: 0.0407 217/500 [============>.................] - ETA: 1:50 - loss: 0.5064 - regression_loss: 0.4657 - classification_loss: 0.0407 218/500 [============>.................] - ETA: 1:49 - loss: 0.5063 - regression_loss: 0.4656 - classification_loss: 0.0407 219/500 [============>.................] - ETA: 1:49 - loss: 0.5051 - regression_loss: 0.4645 - classification_loss: 0.0407 220/500 [============>.................] - ETA: 1:49 - loss: 0.5041 - regression_loss: 0.4635 - classification_loss: 0.0406 221/500 [============>.................] - ETA: 1:48 - loss: 0.5026 - regression_loss: 0.4621 - classification_loss: 0.0404 222/500 [============>.................] - ETA: 1:48 - loss: 0.5023 - regression_loss: 0.4619 - classification_loss: 0.0404 223/500 [============>.................] - ETA: 1:47 - loss: 0.5022 - regression_loss: 0.4619 - classification_loss: 0.0403 224/500 [============>.................] - ETA: 1:47 - loss: 0.5018 - regression_loss: 0.4615 - classification_loss: 0.0403 225/500 [============>.................] - ETA: 1:47 - loss: 0.5010 - regression_loss: 0.4608 - classification_loss: 0.0402 226/500 [============>.................] - ETA: 1:46 - loss: 0.5011 - regression_loss: 0.4609 - classification_loss: 0.0402 227/500 [============>.................] - ETA: 1:46 - loss: 0.5008 - regression_loss: 0.4607 - classification_loss: 0.0401 228/500 [============>.................] 
- ETA: 1:46 - loss: 0.5000 - regression_loss: 0.4599 - classification_loss: 0.0400 229/500 [============>.................] - ETA: 1:45 - loss: 0.5021 - regression_loss: 0.4615 - classification_loss: 0.0406 230/500 [============>.................] - ETA: 1:45 - loss: 0.5018 - regression_loss: 0.4612 - classification_loss: 0.0406 231/500 [============>.................] - ETA: 1:44 - loss: 0.5030 - regression_loss: 0.4621 - classification_loss: 0.0409 232/500 [============>.................] - ETA: 1:44 - loss: 0.5030 - regression_loss: 0.4621 - classification_loss: 0.0409 233/500 [============>.................] - ETA: 1:44 - loss: 0.5019 - regression_loss: 0.4611 - classification_loss: 0.0408 234/500 [=============>................] - ETA: 1:43 - loss: 0.5022 - regression_loss: 0.4613 - classification_loss: 0.0409 235/500 [=============>................] - ETA: 1:43 - loss: 0.5012 - regression_loss: 0.4604 - classification_loss: 0.0408 236/500 [=============>................] - ETA: 1:42 - loss: 0.5012 - regression_loss: 0.4605 - classification_loss: 0.0407 237/500 [=============>................] - ETA: 1:42 - loss: 0.5018 - regression_loss: 0.4611 - classification_loss: 0.0407 238/500 [=============>................] - ETA: 1:42 - loss: 0.5017 - regression_loss: 0.4610 - classification_loss: 0.0407 239/500 [=============>................] - ETA: 1:41 - loss: 0.5004 - regression_loss: 0.4598 - classification_loss: 0.0406 240/500 [=============>................] - ETA: 1:41 - loss: 0.5006 - regression_loss: 0.4600 - classification_loss: 0.0406 241/500 [=============>................] - ETA: 1:40 - loss: 0.5012 - regression_loss: 0.4605 - classification_loss: 0.0406 242/500 [=============>................] - ETA: 1:40 - loss: 0.5007 - regression_loss: 0.4601 - classification_loss: 0.0406 243/500 [=============>................] - ETA: 1:40 - loss: 0.4997 - regression_loss: 0.4592 - classification_loss: 0.0405 244/500 [=============>................] 
- ETA: 1:39 - loss: 0.4989 - regression_loss: 0.4585 - classification_loss: 0.0404 245/500 [=============>................] - ETA: 1:39 - loss: 0.4988 - regression_loss: 0.4584 - classification_loss: 0.0404 246/500 [=============>................] - ETA: 1:38 - loss: 0.4992 - regression_loss: 0.4588 - classification_loss: 0.0404 247/500 [=============>................] - ETA: 1:38 - loss: 0.4987 - regression_loss: 0.4583 - classification_loss: 0.0404 248/500 [=============>................] - ETA: 1:38 - loss: 0.4982 - regression_loss: 0.4579 - classification_loss: 0.0403 249/500 [=============>................] - ETA: 1:37 - loss: 0.4986 - regression_loss: 0.4582 - classification_loss: 0.0403 250/500 [==============>...............] - ETA: 1:37 - loss: 0.4995 - regression_loss: 0.4591 - classification_loss: 0.0404 251/500 [==============>...............] - ETA: 1:36 - loss: 0.4995 - regression_loss: 0.4592 - classification_loss: 0.0403 252/500 [==============>...............] - ETA: 1:36 - loss: 0.4996 - regression_loss: 0.4593 - classification_loss: 0.0402 253/500 [==============>...............] - ETA: 1:36 - loss: 0.4994 - regression_loss: 0.4592 - classification_loss: 0.0402 254/500 [==============>...............] - ETA: 1:35 - loss: 0.5006 - regression_loss: 0.4603 - classification_loss: 0.0403 255/500 [==============>...............] - ETA: 1:35 - loss: 0.5010 - regression_loss: 0.4607 - classification_loss: 0.0403 256/500 [==============>...............] - ETA: 1:35 - loss: 0.5001 - regression_loss: 0.4598 - classification_loss: 0.0403 257/500 [==============>...............] - ETA: 1:34 - loss: 0.5006 - regression_loss: 0.4602 - classification_loss: 0.0405 258/500 [==============>...............] - ETA: 1:34 - loss: 0.4997 - regression_loss: 0.4593 - classification_loss: 0.0404 259/500 [==============>...............] - ETA: 1:33 - loss: 0.4990 - regression_loss: 0.4587 - classification_loss: 0.0404 260/500 [==============>...............] 
[per-batch progress output trimmed: epoch 18, batches 261-499, loss hovering around 0.50]
500/500 [==============================] - 195s 390ms/step - loss: 0.4935 - regression_loss: 0.4539 - classification_loss: 0.0395
326 instances of class plum with average precision: 0.8443
mAP: 0.8443
Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5
Epoch 19/20
[per-batch progress output trimmed: epoch 19, batches 1-13, loss around 0.40]
[per-batch progress output trimmed: epoch 19, batches 14-94, loss around 0.50]
- ETA: 2:38 - loss: 0.5043 - regression_loss: 0.4615 - classification_loss: 0.0428 95/500 [====>.........................] - ETA: 2:37 - loss: 0.5027 - regression_loss: 0.4601 - classification_loss: 0.0426 96/500 [====>.........................] - ETA: 2:37 - loss: 0.5021 - regression_loss: 0.4596 - classification_loss: 0.0425 97/500 [====>.........................] - ETA: 2:36 - loss: 0.5045 - regression_loss: 0.4618 - classification_loss: 0.0427 98/500 [====>.........................] - ETA: 2:36 - loss: 0.5044 - regression_loss: 0.4619 - classification_loss: 0.0425 99/500 [====>.........................] - ETA: 2:36 - loss: 0.5043 - regression_loss: 0.4620 - classification_loss: 0.0424 100/500 [=====>........................] - ETA: 2:35 - loss: 0.5050 - regression_loss: 0.4629 - classification_loss: 0.0422 101/500 [=====>........................] - ETA: 2:35 - loss: 0.5041 - regression_loss: 0.4620 - classification_loss: 0.0421 102/500 [=====>........................] - ETA: 2:34 - loss: 0.4998 - regression_loss: 0.4580 - classification_loss: 0.0418 103/500 [=====>........................] - ETA: 2:34 - loss: 0.5001 - regression_loss: 0.4584 - classification_loss: 0.0417 104/500 [=====>........................] - ETA: 2:34 - loss: 0.5014 - regression_loss: 0.4596 - classification_loss: 0.0417 105/500 [=====>........................] - ETA: 2:33 - loss: 0.5034 - regression_loss: 0.4614 - classification_loss: 0.0420 106/500 [=====>........................] - ETA: 2:33 - loss: 0.5028 - regression_loss: 0.4609 - classification_loss: 0.0419 107/500 [=====>........................] - ETA: 2:32 - loss: 0.5041 - regression_loss: 0.4621 - classification_loss: 0.0420 108/500 [=====>........................] - ETA: 2:32 - loss: 0.5045 - regression_loss: 0.4626 - classification_loss: 0.0419 109/500 [=====>........................] - ETA: 2:32 - loss: 0.5062 - regression_loss: 0.4640 - classification_loss: 0.0422 110/500 [=====>........................] 
- ETA: 2:31 - loss: 0.5058 - regression_loss: 0.4638 - classification_loss: 0.0420 111/500 [=====>........................] - ETA: 2:31 - loss: 0.5066 - regression_loss: 0.4645 - classification_loss: 0.0421 112/500 [=====>........................] - ETA: 2:30 - loss: 0.5046 - regression_loss: 0.4627 - classification_loss: 0.0419 113/500 [=====>........................] - ETA: 2:30 - loss: 0.5027 - regression_loss: 0.4610 - classification_loss: 0.0417 114/500 [=====>........................] - ETA: 2:29 - loss: 0.5006 - regression_loss: 0.4591 - classification_loss: 0.0415 115/500 [=====>........................] - ETA: 2:29 - loss: 0.5012 - regression_loss: 0.4597 - classification_loss: 0.0415 116/500 [=====>........................] - ETA: 2:29 - loss: 0.5006 - regression_loss: 0.4592 - classification_loss: 0.0414 117/500 [======>.......................] - ETA: 2:28 - loss: 0.4991 - regression_loss: 0.4580 - classification_loss: 0.0412 118/500 [======>.......................] - ETA: 2:28 - loss: 0.4985 - regression_loss: 0.4574 - classification_loss: 0.0410 119/500 [======>.......................] - ETA: 2:27 - loss: 0.4961 - regression_loss: 0.4553 - classification_loss: 0.0408 120/500 [======>.......................] - ETA: 2:27 - loss: 0.4989 - regression_loss: 0.4580 - classification_loss: 0.0409 121/500 [======>.......................] - ETA: 2:27 - loss: 0.5034 - regression_loss: 0.4623 - classification_loss: 0.0411 122/500 [======>.......................] - ETA: 2:26 - loss: 0.5065 - regression_loss: 0.4648 - classification_loss: 0.0417 123/500 [======>.......................] - ETA: 2:26 - loss: 0.5079 - regression_loss: 0.4663 - classification_loss: 0.0416 124/500 [======>.......................] - ETA: 2:26 - loss: 0.5122 - regression_loss: 0.4689 - classification_loss: 0.0433 125/500 [======>.......................] - ETA: 2:25 - loss: 0.5104 - regression_loss: 0.4673 - classification_loss: 0.0431 126/500 [======>.......................] 
- ETA: 2:25 - loss: 0.5085 - regression_loss: 0.4656 - classification_loss: 0.0430 127/500 [======>.......................] - ETA: 2:24 - loss: 0.5071 - regression_loss: 0.4643 - classification_loss: 0.0428 128/500 [======>.......................] - ETA: 2:24 - loss: 0.5072 - regression_loss: 0.4645 - classification_loss: 0.0427 129/500 [======>.......................] - ETA: 2:24 - loss: 0.5063 - regression_loss: 0.4638 - classification_loss: 0.0425 130/500 [======>.......................] - ETA: 2:23 - loss: 0.5068 - regression_loss: 0.4644 - classification_loss: 0.0425 131/500 [======>.......................] - ETA: 2:23 - loss: 0.5048 - regression_loss: 0.4624 - classification_loss: 0.0424 132/500 [======>.......................] - ETA: 2:23 - loss: 0.5037 - regression_loss: 0.4614 - classification_loss: 0.0423 133/500 [======>.......................] - ETA: 2:22 - loss: 0.5025 - regression_loss: 0.4602 - classification_loss: 0.0422 134/500 [=======>......................] - ETA: 2:22 - loss: 0.5026 - regression_loss: 0.4604 - classification_loss: 0.0422 135/500 [=======>......................] - ETA: 2:21 - loss: 0.5035 - regression_loss: 0.4611 - classification_loss: 0.0424 136/500 [=======>......................] - ETA: 2:21 - loss: 0.5022 - regression_loss: 0.4600 - classification_loss: 0.0423 137/500 [=======>......................] - ETA: 2:21 - loss: 0.5009 - regression_loss: 0.4588 - classification_loss: 0.0422 138/500 [=======>......................] - ETA: 2:20 - loss: 0.5003 - regression_loss: 0.4583 - classification_loss: 0.0420 139/500 [=======>......................] - ETA: 2:20 - loss: 0.4992 - regression_loss: 0.4572 - classification_loss: 0.0420 140/500 [=======>......................] - ETA: 2:20 - loss: 0.4975 - regression_loss: 0.4556 - classification_loss: 0.0419 141/500 [=======>......................] - ETA: 2:19 - loss: 0.4984 - regression_loss: 0.4564 - classification_loss: 0.0419 142/500 [=======>......................] 
- ETA: 2:19 - loss: 0.4966 - regression_loss: 0.4549 - classification_loss: 0.0417 143/500 [=======>......................] - ETA: 2:18 - loss: 0.5010 - regression_loss: 0.4589 - classification_loss: 0.0421 144/500 [=======>......................] - ETA: 2:18 - loss: 0.5005 - regression_loss: 0.4586 - classification_loss: 0.0419 145/500 [=======>......................] - ETA: 2:18 - loss: 0.5001 - regression_loss: 0.4581 - classification_loss: 0.0420 146/500 [=======>......................] - ETA: 2:17 - loss: 0.4997 - regression_loss: 0.4577 - classification_loss: 0.0419 147/500 [=======>......................] - ETA: 2:17 - loss: 0.4989 - regression_loss: 0.4570 - classification_loss: 0.0418 148/500 [=======>......................] - ETA: 2:17 - loss: 0.4981 - regression_loss: 0.4564 - classification_loss: 0.0417 149/500 [=======>......................] - ETA: 2:16 - loss: 0.4983 - regression_loss: 0.4565 - classification_loss: 0.0417 150/500 [========>.....................] - ETA: 2:16 - loss: 0.4966 - regression_loss: 0.4550 - classification_loss: 0.0416 151/500 [========>.....................] - ETA: 2:15 - loss: 0.4944 - regression_loss: 0.4530 - classification_loss: 0.0414 152/500 [========>.....................] - ETA: 2:15 - loss: 0.4932 - regression_loss: 0.4518 - classification_loss: 0.0414 153/500 [========>.....................] - ETA: 2:15 - loss: 0.4938 - regression_loss: 0.4525 - classification_loss: 0.0413 154/500 [========>.....................] - ETA: 2:14 - loss: 0.4970 - regression_loss: 0.4555 - classification_loss: 0.0416 155/500 [========>.....................] - ETA: 2:14 - loss: 0.4964 - regression_loss: 0.4549 - classification_loss: 0.0415 156/500 [========>.....................] - ETA: 2:14 - loss: 0.5005 - regression_loss: 0.4585 - classification_loss: 0.0420 157/500 [========>.....................] - ETA: 2:13 - loss: 0.5020 - regression_loss: 0.4599 - classification_loss: 0.0422 158/500 [========>.....................] 
- ETA: 2:13 - loss: 0.5020 - regression_loss: 0.4599 - classification_loss: 0.0421 159/500 [========>.....................] - ETA: 2:12 - loss: 0.5028 - regression_loss: 0.4606 - classification_loss: 0.0422 160/500 [========>.....................] - ETA: 2:12 - loss: 0.5025 - regression_loss: 0.4604 - classification_loss: 0.0421 161/500 [========>.....................] - ETA: 2:11 - loss: 0.5026 - regression_loss: 0.4604 - classification_loss: 0.0421 162/500 [========>.....................] - ETA: 2:11 - loss: 0.5029 - regression_loss: 0.4608 - classification_loss: 0.0421 163/500 [========>.....................] - ETA: 2:11 - loss: 0.5053 - regression_loss: 0.4632 - classification_loss: 0.0421 164/500 [========>.....................] - ETA: 2:10 - loss: 0.5046 - regression_loss: 0.4626 - classification_loss: 0.0420 165/500 [========>.....................] - ETA: 2:10 - loss: 0.5036 - regression_loss: 0.4617 - classification_loss: 0.0419 166/500 [========>.....................] - ETA: 2:10 - loss: 0.5021 - regression_loss: 0.4603 - classification_loss: 0.0418 167/500 [=========>....................] - ETA: 2:09 - loss: 0.5011 - regression_loss: 0.4594 - classification_loss: 0.0418 168/500 [=========>....................] - ETA: 2:09 - loss: 0.5007 - regression_loss: 0.4590 - classification_loss: 0.0417 169/500 [=========>....................] - ETA: 2:08 - loss: 0.5006 - regression_loss: 0.4591 - classification_loss: 0.0415 170/500 [=========>....................] - ETA: 2:08 - loss: 0.4995 - regression_loss: 0.4581 - classification_loss: 0.0414 171/500 [=========>....................] - ETA: 2:07 - loss: 0.4991 - regression_loss: 0.4576 - classification_loss: 0.0415 172/500 [=========>....................] - ETA: 2:07 - loss: 0.4988 - regression_loss: 0.4573 - classification_loss: 0.0415 173/500 [=========>....................] - ETA: 2:07 - loss: 0.4991 - regression_loss: 0.4577 - classification_loss: 0.0414 174/500 [=========>....................] 
- ETA: 2:06 - loss: 0.5011 - regression_loss: 0.4594 - classification_loss: 0.0417 175/500 [=========>....................] - ETA: 2:06 - loss: 0.5003 - regression_loss: 0.4587 - classification_loss: 0.0416 176/500 [=========>....................] - ETA: 2:06 - loss: 0.5023 - regression_loss: 0.4607 - classification_loss: 0.0416 177/500 [=========>....................] - ETA: 2:05 - loss: 0.5044 - regression_loss: 0.4626 - classification_loss: 0.0419 178/500 [=========>....................] - ETA: 2:05 - loss: 0.5046 - regression_loss: 0.4628 - classification_loss: 0.0418 179/500 [=========>....................] - ETA: 2:04 - loss: 0.5047 - regression_loss: 0.4630 - classification_loss: 0.0417 180/500 [=========>....................] - ETA: 2:04 - loss: 0.5043 - regression_loss: 0.4627 - classification_loss: 0.0416 181/500 [=========>....................] - ETA: 2:04 - loss: 0.5044 - regression_loss: 0.4628 - classification_loss: 0.0416 182/500 [=========>....................] - ETA: 2:03 - loss: 0.5039 - regression_loss: 0.4624 - classification_loss: 0.0414 183/500 [=========>....................] - ETA: 2:03 - loss: 0.5046 - regression_loss: 0.4633 - classification_loss: 0.0413 184/500 [==========>...................] - ETA: 2:03 - loss: 0.5051 - regression_loss: 0.4638 - classification_loss: 0.0413 185/500 [==========>...................] - ETA: 2:02 - loss: 0.5046 - regression_loss: 0.4635 - classification_loss: 0.0411 186/500 [==========>...................] - ETA: 2:02 - loss: 0.5026 - regression_loss: 0.4617 - classification_loss: 0.0409 187/500 [==========>...................] - ETA: 2:01 - loss: 0.5040 - regression_loss: 0.4629 - classification_loss: 0.0410 188/500 [==========>...................] - ETA: 2:01 - loss: 0.5033 - regression_loss: 0.4624 - classification_loss: 0.0409 189/500 [==========>...................] - ETA: 2:01 - loss: 0.5033 - regression_loss: 0.4624 - classification_loss: 0.0409 190/500 [==========>...................] 
- ETA: 2:00 - loss: 0.5059 - regression_loss: 0.4646 - classification_loss: 0.0413 191/500 [==========>...................] - ETA: 2:00 - loss: 0.5076 - regression_loss: 0.4661 - classification_loss: 0.0415 192/500 [==========>...................] - ETA: 1:59 - loss: 0.5082 - regression_loss: 0.4667 - classification_loss: 0.0415 193/500 [==========>...................] - ETA: 1:59 - loss: 0.5070 - regression_loss: 0.4657 - classification_loss: 0.0413 194/500 [==========>...................] - ETA: 1:59 - loss: 0.5062 - regression_loss: 0.4650 - classification_loss: 0.0413 195/500 [==========>...................] - ETA: 1:58 - loss: 0.5053 - regression_loss: 0.4641 - classification_loss: 0.0412 196/500 [==========>...................] - ETA: 1:58 - loss: 0.5048 - regression_loss: 0.4636 - classification_loss: 0.0412 197/500 [==========>...................] - ETA: 1:57 - loss: 0.5039 - regression_loss: 0.4628 - classification_loss: 0.0411 198/500 [==========>...................] - ETA: 1:57 - loss: 0.5031 - regression_loss: 0.4618 - classification_loss: 0.0412 199/500 [==========>...................] - ETA: 1:57 - loss: 0.5019 - regression_loss: 0.4608 - classification_loss: 0.0411 200/500 [===========>..................] - ETA: 1:56 - loss: 0.5006 - regression_loss: 0.4597 - classification_loss: 0.0409 201/500 [===========>..................] - ETA: 1:56 - loss: 0.5006 - regression_loss: 0.4596 - classification_loss: 0.0410 202/500 [===========>..................] - ETA: 1:56 - loss: 0.5005 - regression_loss: 0.4597 - classification_loss: 0.0409 203/500 [===========>..................] - ETA: 1:55 - loss: 0.5003 - regression_loss: 0.4595 - classification_loss: 0.0408 204/500 [===========>..................] - ETA: 1:55 - loss: 0.5014 - regression_loss: 0.4607 - classification_loss: 0.0408 205/500 [===========>..................] - ETA: 1:54 - loss: 0.5009 - regression_loss: 0.4602 - classification_loss: 0.0407 206/500 [===========>..................] 
- ETA: 1:54 - loss: 0.4998 - regression_loss: 0.4592 - classification_loss: 0.0407 207/500 [===========>..................] - ETA: 1:54 - loss: 0.5005 - regression_loss: 0.4598 - classification_loss: 0.0407 208/500 [===========>..................] - ETA: 1:53 - loss: 0.5007 - regression_loss: 0.4601 - classification_loss: 0.0406 209/500 [===========>..................] - ETA: 1:53 - loss: 0.5005 - regression_loss: 0.4600 - classification_loss: 0.0405 210/500 [===========>..................] - ETA: 1:52 - loss: 0.4998 - regression_loss: 0.4594 - classification_loss: 0.0404 211/500 [===========>..................] - ETA: 1:52 - loss: 0.5001 - regression_loss: 0.4598 - classification_loss: 0.0404 212/500 [===========>..................] - ETA: 1:52 - loss: 0.5007 - regression_loss: 0.4603 - classification_loss: 0.0403 213/500 [===========>..................] - ETA: 1:51 - loss: 0.5013 - regression_loss: 0.4610 - classification_loss: 0.0404 214/500 [===========>..................] - ETA: 1:51 - loss: 0.5026 - regression_loss: 0.4623 - classification_loss: 0.0403 215/500 [===========>..................] - ETA: 1:51 - loss: 0.5032 - regression_loss: 0.4629 - classification_loss: 0.0402 216/500 [===========>..................] - ETA: 1:50 - loss: 0.5023 - regression_loss: 0.4621 - classification_loss: 0.0402 217/500 [============>.................] - ETA: 1:50 - loss: 0.5009 - regression_loss: 0.4608 - classification_loss: 0.0401 218/500 [============>.................] - ETA: 1:49 - loss: 0.5004 - regression_loss: 0.4603 - classification_loss: 0.0401 219/500 [============>.................] - ETA: 1:49 - loss: 0.4992 - regression_loss: 0.4592 - classification_loss: 0.0400 220/500 [============>.................] - ETA: 1:49 - loss: 0.4991 - regression_loss: 0.4589 - classification_loss: 0.0402 221/500 [============>.................] - ETA: 1:48 - loss: 0.5011 - regression_loss: 0.4605 - classification_loss: 0.0406 222/500 [============>.................] 
- ETA: 1:48 - loss: 0.5026 - regression_loss: 0.4618 - classification_loss: 0.0408 223/500 [============>.................] - ETA: 1:47 - loss: 0.5025 - regression_loss: 0.4618 - classification_loss: 0.0407 224/500 [============>.................] - ETA: 1:47 - loss: 0.5026 - regression_loss: 0.4619 - classification_loss: 0.0407 225/500 [============>.................] - ETA: 1:47 - loss: 0.5018 - regression_loss: 0.4613 - classification_loss: 0.0406 226/500 [============>.................] - ETA: 1:46 - loss: 0.5028 - regression_loss: 0.4621 - classification_loss: 0.0407 227/500 [============>.................] - ETA: 1:46 - loss: 0.5021 - regression_loss: 0.4615 - classification_loss: 0.0406 228/500 [============>.................] - ETA: 1:46 - loss: 0.5012 - regression_loss: 0.4607 - classification_loss: 0.0405 229/500 [============>.................] - ETA: 1:45 - loss: 0.5014 - regression_loss: 0.4610 - classification_loss: 0.0404 230/500 [============>.................] - ETA: 1:45 - loss: 0.5018 - regression_loss: 0.4613 - classification_loss: 0.0405 231/500 [============>.................] - ETA: 1:44 - loss: 0.5009 - regression_loss: 0.4605 - classification_loss: 0.0404 232/500 [============>.................] - ETA: 1:44 - loss: 0.5008 - regression_loss: 0.4605 - classification_loss: 0.0403 233/500 [============>.................] - ETA: 1:44 - loss: 0.4997 - regression_loss: 0.4595 - classification_loss: 0.0402 234/500 [=============>................] - ETA: 1:43 - loss: 0.4984 - regression_loss: 0.4583 - classification_loss: 0.0401 235/500 [=============>................] - ETA: 1:43 - loss: 0.4990 - regression_loss: 0.4590 - classification_loss: 0.0401 236/500 [=============>................] - ETA: 1:42 - loss: 0.5021 - regression_loss: 0.4614 - classification_loss: 0.0407 237/500 [=============>................] - ETA: 1:42 - loss: 0.5012 - regression_loss: 0.4606 - classification_loss: 0.0406 238/500 [=============>................] 
- ETA: 1:42 - loss: 0.5015 - regression_loss: 0.4609 - classification_loss: 0.0406 239/500 [=============>................] - ETA: 1:41 - loss: 0.5021 - regression_loss: 0.4614 - classification_loss: 0.0408 240/500 [=============>................] - ETA: 1:41 - loss: 0.5045 - regression_loss: 0.4635 - classification_loss: 0.0410 241/500 [=============>................] - ETA: 1:40 - loss: 0.5038 - regression_loss: 0.4630 - classification_loss: 0.0409 242/500 [=============>................] - ETA: 1:40 - loss: 0.5041 - regression_loss: 0.4631 - classification_loss: 0.0410 243/500 [=============>................] - ETA: 1:40 - loss: 0.5033 - regression_loss: 0.4624 - classification_loss: 0.0409 244/500 [=============>................] - ETA: 1:39 - loss: 0.5021 - regression_loss: 0.4612 - classification_loss: 0.0408 245/500 [=============>................] - ETA: 1:39 - loss: 0.5027 - regression_loss: 0.4619 - classification_loss: 0.0408 246/500 [=============>................] - ETA: 1:38 - loss: 0.5046 - regression_loss: 0.4637 - classification_loss: 0.0408 247/500 [=============>................] - ETA: 1:38 - loss: 0.5043 - regression_loss: 0.4636 - classification_loss: 0.0408 248/500 [=============>................] - ETA: 1:38 - loss: 0.5036 - regression_loss: 0.4629 - classification_loss: 0.0407 249/500 [=============>................] - ETA: 1:37 - loss: 0.5028 - regression_loss: 0.4622 - classification_loss: 0.0406 250/500 [==============>...............] - ETA: 1:37 - loss: 0.5027 - regression_loss: 0.4621 - classification_loss: 0.0406 251/500 [==============>...............] - ETA: 1:36 - loss: 0.5018 - regression_loss: 0.4612 - classification_loss: 0.0405 252/500 [==============>...............] - ETA: 1:36 - loss: 0.5020 - regression_loss: 0.4614 - classification_loss: 0.0406 253/500 [==============>...............] - ETA: 1:36 - loss: 0.5012 - regression_loss: 0.4607 - classification_loss: 0.0405 254/500 [==============>...............] 
- ETA: 1:35 - loss: 0.5028 - regression_loss: 0.4617 - classification_loss: 0.0411 255/500 [==============>...............] - ETA: 1:35 - loss: 0.5035 - regression_loss: 0.4623 - classification_loss: 0.0412 256/500 [==============>...............] - ETA: 1:35 - loss: 0.5032 - regression_loss: 0.4621 - classification_loss: 0.0411 257/500 [==============>...............] - ETA: 1:34 - loss: 0.5026 - regression_loss: 0.4616 - classification_loss: 0.0410 258/500 [==============>...............] - ETA: 1:34 - loss: 0.5036 - regression_loss: 0.4625 - classification_loss: 0.0411 259/500 [==============>...............] - ETA: 1:33 - loss: 0.5037 - regression_loss: 0.4627 - classification_loss: 0.0410 260/500 [==============>...............] - ETA: 1:33 - loss: 0.5031 - regression_loss: 0.4622 - classification_loss: 0.0409 261/500 [==============>...............] - ETA: 1:33 - loss: 0.5017 - regression_loss: 0.4609 - classification_loss: 0.0408 262/500 [==============>...............] - ETA: 1:32 - loss: 0.5013 - regression_loss: 0.4605 - classification_loss: 0.0407 263/500 [==============>...............] - ETA: 1:32 - loss: 0.5018 - regression_loss: 0.4611 - classification_loss: 0.0407 264/500 [==============>...............] - ETA: 1:31 - loss: 0.5012 - regression_loss: 0.4605 - classification_loss: 0.0407 265/500 [==============>...............] - ETA: 1:31 - loss: 0.5001 - regression_loss: 0.4595 - classification_loss: 0.0406 266/500 [==============>...............] - ETA: 1:31 - loss: 0.4996 - regression_loss: 0.4590 - classification_loss: 0.0405 267/500 [===============>..............] - ETA: 1:30 - loss: 0.4987 - regression_loss: 0.4582 - classification_loss: 0.0404 268/500 [===============>..............] - ETA: 1:30 - loss: 0.4992 - regression_loss: 0.4588 - classification_loss: 0.0404 269/500 [===============>..............] - ETA: 1:30 - loss: 0.4988 - regression_loss: 0.4584 - classification_loss: 0.0403 270/500 [===============>..............] 
- ETA: 1:29 - loss: 0.4987 - regression_loss: 0.4583 - classification_loss: 0.0403 271/500 [===============>..............] - ETA: 1:29 - loss: 0.4986 - regression_loss: 0.4582 - classification_loss: 0.0403 272/500 [===============>..............] - ETA: 1:28 - loss: 0.4983 - regression_loss: 0.4580 - classification_loss: 0.0403 273/500 [===============>..............] - ETA: 1:28 - loss: 0.4980 - regression_loss: 0.4578 - classification_loss: 0.0402 274/500 [===============>..............] - ETA: 1:28 - loss: 0.4976 - regression_loss: 0.4574 - classification_loss: 0.0401 275/500 [===============>..............] - ETA: 1:27 - loss: 0.4974 - regression_loss: 0.4573 - classification_loss: 0.0401 276/500 [===============>..............] - ETA: 1:27 - loss: 0.4975 - regression_loss: 0.4575 - classification_loss: 0.0400 277/500 [===============>..............] - ETA: 1:26 - loss: 0.4985 - regression_loss: 0.4583 - classification_loss: 0.0403 278/500 [===============>..............] - ETA: 1:26 - loss: 0.4983 - regression_loss: 0.4581 - classification_loss: 0.0402 279/500 [===============>..............] - ETA: 1:26 - loss: 0.4980 - regression_loss: 0.4578 - classification_loss: 0.0401 280/500 [===============>..............] - ETA: 1:25 - loss: 0.4979 - regression_loss: 0.4578 - classification_loss: 0.0401 281/500 [===============>..............] - ETA: 1:25 - loss: 0.4975 - regression_loss: 0.4575 - classification_loss: 0.0401 282/500 [===============>..............] - ETA: 1:24 - loss: 0.4973 - regression_loss: 0.4573 - classification_loss: 0.0401 283/500 [===============>..............] - ETA: 1:24 - loss: 0.4967 - regression_loss: 0.4567 - classification_loss: 0.0400 284/500 [================>.............] - ETA: 1:24 - loss: 0.4965 - regression_loss: 0.4566 - classification_loss: 0.0399 285/500 [================>.............] - ETA: 1:23 - loss: 0.4961 - regression_loss: 0.4563 - classification_loss: 0.0398 286/500 [================>.............] 
- ETA: 1:23 - loss: 0.4959 - regression_loss: 0.4561 - classification_loss: 0.0398 287/500 [================>.............] - ETA: 1:23 - loss: 0.4963 - regression_loss: 0.4565 - classification_loss: 0.0398 288/500 [================>.............] - ETA: 1:22 - loss: 0.4963 - regression_loss: 0.4564 - classification_loss: 0.0398 289/500 [================>.............] - ETA: 1:22 - loss: 0.4957 - regression_loss: 0.4560 - classification_loss: 0.0398 290/500 [================>.............] - ETA: 1:21 - loss: 0.4956 - regression_loss: 0.4558 - classification_loss: 0.0398 291/500 [================>.............] - ETA: 1:21 - loss: 0.4967 - regression_loss: 0.4567 - classification_loss: 0.0400 292/500 [================>.............] - ETA: 1:21 - loss: 0.4980 - regression_loss: 0.4580 - classification_loss: 0.0400 293/500 [================>.............] - ETA: 1:20 - loss: 0.4984 - regression_loss: 0.4583 - classification_loss: 0.0401 294/500 [================>.............] - ETA: 1:20 - loss: 0.4979 - regression_loss: 0.4579 - classification_loss: 0.0400 295/500 [================>.............] - ETA: 1:19 - loss: 0.4978 - regression_loss: 0.4578 - classification_loss: 0.0401 296/500 [================>.............] - ETA: 1:19 - loss: 0.4975 - regression_loss: 0.4575 - classification_loss: 0.0401 297/500 [================>.............] - ETA: 1:19 - loss: 0.4987 - regression_loss: 0.4585 - classification_loss: 0.0402 298/500 [================>.............] - ETA: 1:18 - loss: 0.4985 - regression_loss: 0.4582 - classification_loss: 0.0402 299/500 [================>.............] - ETA: 1:18 - loss: 0.4986 - regression_loss: 0.4584 - classification_loss: 0.0402 300/500 [=================>............] - ETA: 1:17 - loss: 0.4981 - regression_loss: 0.4579 - classification_loss: 0.0402 301/500 [=================>............] - ETA: 1:17 - loss: 0.4979 - regression_loss: 0.4577 - classification_loss: 0.0401 302/500 [=================>............] 
[Epoch 19 per-step progress output (steps 303–500) condensed: loss fluctuated between ~0.498 and ~0.482 over the remaining batches.]
500/500 [==============================] - 195s 390ms/step - loss: 0.4816 - regression_loss: 0.4426 - classification_loss: 0.0390 326 instances of class plum with average precision: 0.8453 mAP: 0.8453 Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5 Epoch 20/20
[Epoch 20 per-step progress output (steps 1–9) condensed: loss settled around ~0.49 after the first few batches.]
[Epoch 20 per-step progress output (steps 10–137) condensed: loss hovered around 0.42–0.45, with regression_loss ~0.40 and classification_loss ~0.036–0.038.]
- ETA: 2:21 - loss: 0.4443 - regression_loss: 0.4070 - classification_loss: 0.0373 138/500 [=======>......................] - ETA: 2:20 - loss: 0.4427 - regression_loss: 0.4055 - classification_loss: 0.0372 139/500 [=======>......................] - ETA: 2:20 - loss: 0.4420 - regression_loss: 0.4049 - classification_loss: 0.0371 140/500 [=======>......................] - ETA: 2:20 - loss: 0.4412 - regression_loss: 0.4042 - classification_loss: 0.0370 141/500 [=======>......................] - ETA: 2:19 - loss: 0.4398 - regression_loss: 0.4029 - classification_loss: 0.0369 142/500 [=======>......................] - ETA: 2:19 - loss: 0.4398 - regression_loss: 0.4030 - classification_loss: 0.0369 143/500 [=======>......................] - ETA: 2:19 - loss: 0.4400 - regression_loss: 0.4033 - classification_loss: 0.0367 144/500 [=======>......................] - ETA: 2:18 - loss: 0.4398 - regression_loss: 0.4029 - classification_loss: 0.0369 145/500 [=======>......................] - ETA: 2:18 - loss: 0.4388 - regression_loss: 0.4018 - classification_loss: 0.0370 146/500 [=======>......................] - ETA: 2:17 - loss: 0.4390 - regression_loss: 0.4021 - classification_loss: 0.0369 147/500 [=======>......................] - ETA: 2:17 - loss: 0.4397 - regression_loss: 0.4027 - classification_loss: 0.0369 148/500 [=======>......................] - ETA: 2:17 - loss: 0.4394 - regression_loss: 0.4025 - classification_loss: 0.0369 149/500 [=======>......................] - ETA: 2:16 - loss: 0.4407 - regression_loss: 0.4038 - classification_loss: 0.0369 150/500 [========>.....................] - ETA: 2:16 - loss: 0.4391 - regression_loss: 0.4023 - classification_loss: 0.0367 151/500 [========>.....................] - ETA: 2:15 - loss: 0.4385 - regression_loss: 0.4018 - classification_loss: 0.0366 152/500 [========>.....................] - ETA: 2:15 - loss: 0.4381 - regression_loss: 0.4015 - classification_loss: 0.0366 153/500 [========>.....................] 
- ETA: 2:15 - loss: 0.4387 - regression_loss: 0.4021 - classification_loss: 0.0366 154/500 [========>.....................] - ETA: 2:14 - loss: 0.4400 - regression_loss: 0.4033 - classification_loss: 0.0367 155/500 [========>.....................] - ETA: 2:14 - loss: 0.4391 - regression_loss: 0.4026 - classification_loss: 0.0365 156/500 [========>.....................] - ETA: 2:14 - loss: 0.4398 - regression_loss: 0.4032 - classification_loss: 0.0366 157/500 [========>.....................] - ETA: 2:13 - loss: 0.4403 - regression_loss: 0.4035 - classification_loss: 0.0368 158/500 [========>.....................] - ETA: 2:13 - loss: 0.4416 - regression_loss: 0.4049 - classification_loss: 0.0367 159/500 [========>.....................] - ETA: 2:12 - loss: 0.4407 - regression_loss: 0.4040 - classification_loss: 0.0366 160/500 [========>.....................] - ETA: 2:12 - loss: 0.4407 - regression_loss: 0.4041 - classification_loss: 0.0366 161/500 [========>.....................] - ETA: 2:12 - loss: 0.4389 - regression_loss: 0.4025 - classification_loss: 0.0364 162/500 [========>.....................] - ETA: 2:11 - loss: 0.4374 - regression_loss: 0.4010 - classification_loss: 0.0363 163/500 [========>.....................] - ETA: 2:11 - loss: 0.4385 - regression_loss: 0.4021 - classification_loss: 0.0364 164/500 [========>.....................] - ETA: 2:10 - loss: 0.4369 - regression_loss: 0.4007 - classification_loss: 0.0362 165/500 [========>.....................] - ETA: 2:10 - loss: 0.4406 - regression_loss: 0.4035 - classification_loss: 0.0371 166/500 [========>.....................] - ETA: 2:10 - loss: 0.4409 - regression_loss: 0.4039 - classification_loss: 0.0371 167/500 [=========>....................] - ETA: 2:09 - loss: 0.4430 - regression_loss: 0.4058 - classification_loss: 0.0372 168/500 [=========>....................] - ETA: 2:09 - loss: 0.4433 - regression_loss: 0.4063 - classification_loss: 0.0371 169/500 [=========>....................] 
- ETA: 2:09 - loss: 0.4437 - regression_loss: 0.4067 - classification_loss: 0.0370 170/500 [=========>....................] - ETA: 2:08 - loss: 0.4430 - regression_loss: 0.4062 - classification_loss: 0.0369 171/500 [=========>....................] - ETA: 2:08 - loss: 0.4413 - regression_loss: 0.4045 - classification_loss: 0.0367 172/500 [=========>....................] - ETA: 2:07 - loss: 0.4412 - regression_loss: 0.4045 - classification_loss: 0.0367 173/500 [=========>....................] - ETA: 2:07 - loss: 0.4414 - regression_loss: 0.4048 - classification_loss: 0.0366 174/500 [=========>....................] - ETA: 2:07 - loss: 0.4407 - regression_loss: 0.4042 - classification_loss: 0.0365 175/500 [=========>....................] - ETA: 2:06 - loss: 0.4405 - regression_loss: 0.4040 - classification_loss: 0.0364 176/500 [=========>....................] - ETA: 2:06 - loss: 0.4445 - regression_loss: 0.4077 - classification_loss: 0.0368 177/500 [=========>....................] - ETA: 2:05 - loss: 0.4449 - regression_loss: 0.4080 - classification_loss: 0.0369 178/500 [=========>....................] - ETA: 2:05 - loss: 0.4440 - regression_loss: 0.4070 - classification_loss: 0.0369 179/500 [=========>....................] - ETA: 2:05 - loss: 0.4428 - regression_loss: 0.4060 - classification_loss: 0.0368 180/500 [=========>....................] - ETA: 2:04 - loss: 0.4418 - regression_loss: 0.4051 - classification_loss: 0.0367 181/500 [=========>....................] - ETA: 2:04 - loss: 0.4409 - regression_loss: 0.4042 - classification_loss: 0.0367 182/500 [=========>....................] - ETA: 2:03 - loss: 0.4432 - regression_loss: 0.4064 - classification_loss: 0.0368 183/500 [=========>....................] - ETA: 2:03 - loss: 0.4436 - regression_loss: 0.4069 - classification_loss: 0.0367 184/500 [==========>...................] - ETA: 2:03 - loss: 0.4438 - regression_loss: 0.4072 - classification_loss: 0.0366 185/500 [==========>...................] 
- ETA: 2:02 - loss: 0.4437 - regression_loss: 0.4072 - classification_loss: 0.0366 186/500 [==========>...................] - ETA: 2:02 - loss: 0.4426 - regression_loss: 0.4061 - classification_loss: 0.0365 187/500 [==========>...................] - ETA: 2:01 - loss: 0.4432 - regression_loss: 0.4066 - classification_loss: 0.0365 188/500 [==========>...................] - ETA: 2:01 - loss: 0.4436 - regression_loss: 0.4071 - classification_loss: 0.0365 189/500 [==========>...................] - ETA: 2:01 - loss: 0.4429 - regression_loss: 0.4065 - classification_loss: 0.0364 190/500 [==========>...................] - ETA: 2:00 - loss: 0.4422 - regression_loss: 0.4058 - classification_loss: 0.0364 191/500 [==========>...................] - ETA: 2:00 - loss: 0.4428 - regression_loss: 0.4064 - classification_loss: 0.0364 192/500 [==========>...................] - ETA: 2:00 - loss: 0.4424 - regression_loss: 0.4061 - classification_loss: 0.0364 193/500 [==========>...................] - ETA: 1:59 - loss: 0.4450 - regression_loss: 0.4085 - classification_loss: 0.0365 194/500 [==========>...................] - ETA: 1:59 - loss: 0.4448 - regression_loss: 0.4082 - classification_loss: 0.0365 195/500 [==========>...................] - ETA: 1:58 - loss: 0.4432 - regression_loss: 0.4068 - classification_loss: 0.0364 196/500 [==========>...................] - ETA: 1:58 - loss: 0.4431 - regression_loss: 0.4068 - classification_loss: 0.0364 197/500 [==========>...................] - ETA: 1:58 - loss: 0.4421 - regression_loss: 0.4059 - classification_loss: 0.0362 198/500 [==========>...................] - ETA: 1:57 - loss: 0.4414 - regression_loss: 0.4053 - classification_loss: 0.0361 199/500 [==========>...................] - ETA: 1:57 - loss: 0.4416 - regression_loss: 0.4054 - classification_loss: 0.0362 200/500 [===========>..................] - ETA: 1:56 - loss: 0.4439 - regression_loss: 0.4076 - classification_loss: 0.0364 201/500 [===========>..................] 
- ETA: 1:56 - loss: 0.4448 - regression_loss: 0.4084 - classification_loss: 0.0364 202/500 [===========>..................] - ETA: 1:56 - loss: 0.4433 - regression_loss: 0.4070 - classification_loss: 0.0363 203/500 [===========>..................] - ETA: 1:55 - loss: 0.4422 - regression_loss: 0.4061 - classification_loss: 0.0361 204/500 [===========>..................] - ETA: 1:55 - loss: 0.4435 - regression_loss: 0.4072 - classification_loss: 0.0363 205/500 [===========>..................] - ETA: 1:54 - loss: 0.4432 - regression_loss: 0.4071 - classification_loss: 0.0362 206/500 [===========>..................] - ETA: 1:54 - loss: 0.4443 - regression_loss: 0.4080 - classification_loss: 0.0362 207/500 [===========>..................] - ETA: 1:54 - loss: 0.4437 - regression_loss: 0.4075 - classification_loss: 0.0361 208/500 [===========>..................] - ETA: 1:53 - loss: 0.4444 - regression_loss: 0.4082 - classification_loss: 0.0362 209/500 [===========>..................] - ETA: 1:53 - loss: 0.4449 - regression_loss: 0.4087 - classification_loss: 0.0361 210/500 [===========>..................] - ETA: 1:53 - loss: 0.4443 - regression_loss: 0.4082 - classification_loss: 0.0361 211/500 [===========>..................] - ETA: 1:52 - loss: 0.4440 - regression_loss: 0.4080 - classification_loss: 0.0360 212/500 [===========>..................] - ETA: 1:52 - loss: 0.4433 - regression_loss: 0.4075 - classification_loss: 0.0359 213/500 [===========>..................] - ETA: 1:51 - loss: 0.4444 - regression_loss: 0.4085 - classification_loss: 0.0360 214/500 [===========>..................] - ETA: 1:51 - loss: 0.4444 - regression_loss: 0.4085 - classification_loss: 0.0359 215/500 [===========>..................] - ETA: 1:51 - loss: 0.4485 - regression_loss: 0.4121 - classification_loss: 0.0365 216/500 [===========>..................] - ETA: 1:50 - loss: 0.4495 - regression_loss: 0.4130 - classification_loss: 0.0365 217/500 [============>.................] 
- ETA: 1:50 - loss: 0.4497 - regression_loss: 0.4132 - classification_loss: 0.0365 218/500 [============>.................] - ETA: 1:49 - loss: 0.4503 - regression_loss: 0.4137 - classification_loss: 0.0365 219/500 [============>.................] - ETA: 1:49 - loss: 0.4503 - regression_loss: 0.4138 - classification_loss: 0.0365 220/500 [============>.................] - ETA: 1:49 - loss: 0.4517 - regression_loss: 0.4149 - classification_loss: 0.0368 221/500 [============>.................] - ETA: 1:48 - loss: 0.4507 - regression_loss: 0.4140 - classification_loss: 0.0367 222/500 [============>.................] - ETA: 1:48 - loss: 0.4499 - regression_loss: 0.4133 - classification_loss: 0.0366 223/500 [============>.................] - ETA: 1:47 - loss: 0.4495 - regression_loss: 0.4130 - classification_loss: 0.0365 224/500 [============>.................] - ETA: 1:47 - loss: 0.4496 - regression_loss: 0.4131 - classification_loss: 0.0365 225/500 [============>.................] - ETA: 1:47 - loss: 0.4511 - regression_loss: 0.4144 - classification_loss: 0.0367 226/500 [============>.................] - ETA: 1:46 - loss: 0.4504 - regression_loss: 0.4138 - classification_loss: 0.0366 227/500 [============>.................] - ETA: 1:46 - loss: 0.4504 - regression_loss: 0.4138 - classification_loss: 0.0366 228/500 [============>.................] - ETA: 1:46 - loss: 0.4499 - regression_loss: 0.4134 - classification_loss: 0.0365 229/500 [============>.................] - ETA: 1:45 - loss: 0.4491 - regression_loss: 0.4126 - classification_loss: 0.0364 230/500 [============>.................] - ETA: 1:45 - loss: 0.4485 - regression_loss: 0.4121 - classification_loss: 0.0364 231/500 [============>.................] - ETA: 1:44 - loss: 0.4474 - regression_loss: 0.4110 - classification_loss: 0.0363 232/500 [============>.................] - ETA: 1:44 - loss: 0.4471 - regression_loss: 0.4108 - classification_loss: 0.0363 233/500 [============>.................] 
- ETA: 1:44 - loss: 0.4466 - regression_loss: 0.4104 - classification_loss: 0.0362 234/500 [=============>................] - ETA: 1:43 - loss: 0.4470 - regression_loss: 0.4108 - classification_loss: 0.0362 235/500 [=============>................] - ETA: 1:43 - loss: 0.4498 - regression_loss: 0.4136 - classification_loss: 0.0362 236/500 [=============>................] - ETA: 1:42 - loss: 0.4496 - regression_loss: 0.4134 - classification_loss: 0.0362 237/500 [=============>................] - ETA: 1:42 - loss: 0.4490 - regression_loss: 0.4129 - classification_loss: 0.0361 238/500 [=============>................] - ETA: 1:42 - loss: 0.4484 - regression_loss: 0.4124 - classification_loss: 0.0360 239/500 [=============>................] - ETA: 1:41 - loss: 0.4480 - regression_loss: 0.4120 - classification_loss: 0.0359 240/500 [=============>................] - ETA: 1:41 - loss: 0.4475 - regression_loss: 0.4116 - classification_loss: 0.0359 241/500 [=============>................] - ETA: 1:40 - loss: 0.4469 - regression_loss: 0.4112 - classification_loss: 0.0357 242/500 [=============>................] - ETA: 1:40 - loss: 0.4474 - regression_loss: 0.4116 - classification_loss: 0.0358 243/500 [=============>................] - ETA: 1:40 - loss: 0.4476 - regression_loss: 0.4119 - classification_loss: 0.0357 244/500 [=============>................] - ETA: 1:39 - loss: 0.4488 - regression_loss: 0.4131 - classification_loss: 0.0357 245/500 [=============>................] - ETA: 1:39 - loss: 0.4493 - regression_loss: 0.4136 - classification_loss: 0.0357 246/500 [=============>................] - ETA: 1:39 - loss: 0.4485 - regression_loss: 0.4129 - classification_loss: 0.0356 247/500 [=============>................] - ETA: 1:38 - loss: 0.4494 - regression_loss: 0.4137 - classification_loss: 0.0357 248/500 [=============>................] - ETA: 1:38 - loss: 0.4495 - regression_loss: 0.4139 - classification_loss: 0.0356 249/500 [=============>................] 
- ETA: 1:37 - loss: 0.4518 - regression_loss: 0.4159 - classification_loss: 0.0359 250/500 [==============>...............] - ETA: 1:37 - loss: 0.4509 - regression_loss: 0.4151 - classification_loss: 0.0358 251/500 [==============>...............] - ETA: 1:37 - loss: 0.4511 - regression_loss: 0.4153 - classification_loss: 0.0358 252/500 [==============>...............] - ETA: 1:36 - loss: 0.4514 - regression_loss: 0.4156 - classification_loss: 0.0358 253/500 [==============>...............] - ETA: 1:36 - loss: 0.4504 - regression_loss: 0.4147 - classification_loss: 0.0357 254/500 [==============>...............] - ETA: 1:35 - loss: 0.4500 - regression_loss: 0.4144 - classification_loss: 0.0356 255/500 [==============>...............] - ETA: 1:35 - loss: 0.4524 - regression_loss: 0.4165 - classification_loss: 0.0359 256/500 [==============>...............] - ETA: 1:35 - loss: 0.4521 - regression_loss: 0.4163 - classification_loss: 0.0359 257/500 [==============>...............] - ETA: 1:34 - loss: 0.4529 - regression_loss: 0.4170 - classification_loss: 0.0360 258/500 [==============>...............] - ETA: 1:34 - loss: 0.4533 - regression_loss: 0.4174 - classification_loss: 0.0359 259/500 [==============>...............] - ETA: 1:33 - loss: 0.4545 - regression_loss: 0.4186 - classification_loss: 0.0359 260/500 [==============>...............] - ETA: 1:33 - loss: 0.4551 - regression_loss: 0.4192 - classification_loss: 0.0359 261/500 [==============>...............] - ETA: 1:33 - loss: 0.4556 - regression_loss: 0.4197 - classification_loss: 0.0359 262/500 [==============>...............] - ETA: 1:32 - loss: 0.4557 - regression_loss: 0.4198 - classification_loss: 0.0360 263/500 [==============>...............] - ETA: 1:32 - loss: 0.4560 - regression_loss: 0.4201 - classification_loss: 0.0359 264/500 [==============>...............] - ETA: 1:31 - loss: 0.4565 - regression_loss: 0.4205 - classification_loss: 0.0359 265/500 [==============>...............] 
- ETA: 1:31 - loss: 0.4566 - regression_loss: 0.4207 - classification_loss: 0.0359 266/500 [==============>...............] - ETA: 1:31 - loss: 0.4560 - regression_loss: 0.4202 - classification_loss: 0.0358 267/500 [===============>..............] - ETA: 1:30 - loss: 0.4558 - regression_loss: 0.4200 - classification_loss: 0.0358 268/500 [===============>..............] - ETA: 1:30 - loss: 0.4553 - regression_loss: 0.4196 - classification_loss: 0.0357 269/500 [===============>..............] - ETA: 1:30 - loss: 0.4560 - regression_loss: 0.4203 - classification_loss: 0.0358 270/500 [===============>..............] - ETA: 1:29 - loss: 0.4553 - regression_loss: 0.4196 - classification_loss: 0.0357 271/500 [===============>..............] - ETA: 1:29 - loss: 0.4552 - regression_loss: 0.4195 - classification_loss: 0.0357 272/500 [===============>..............] - ETA: 1:28 - loss: 0.4549 - regression_loss: 0.4193 - classification_loss: 0.0357 273/500 [===============>..............] - ETA: 1:28 - loss: 0.4543 - regression_loss: 0.4187 - classification_loss: 0.0356 274/500 [===============>..............] - ETA: 1:28 - loss: 0.4552 - regression_loss: 0.4196 - classification_loss: 0.0356 275/500 [===============>..............] - ETA: 1:27 - loss: 0.4560 - regression_loss: 0.4203 - classification_loss: 0.0356 276/500 [===============>..............] - ETA: 1:27 - loss: 0.4559 - regression_loss: 0.4203 - classification_loss: 0.0356 277/500 [===============>..............] - ETA: 1:26 - loss: 0.4563 - regression_loss: 0.4207 - classification_loss: 0.0356 278/500 [===============>..............] - ETA: 1:26 - loss: 0.4558 - regression_loss: 0.4203 - classification_loss: 0.0356 279/500 [===============>..............] - ETA: 1:26 - loss: 0.4561 - regression_loss: 0.4205 - classification_loss: 0.0355 280/500 [===============>..............] - ETA: 1:25 - loss: 0.4551 - regression_loss: 0.4197 - classification_loss: 0.0355 281/500 [===============>..............] 
- ETA: 1:25 - loss: 0.4546 - regression_loss: 0.4192 - classification_loss: 0.0354 282/500 [===============>..............] - ETA: 1:24 - loss: 0.4549 - regression_loss: 0.4195 - classification_loss: 0.0355 283/500 [===============>..............] - ETA: 1:24 - loss: 0.4546 - regression_loss: 0.4192 - classification_loss: 0.0354 284/500 [================>.............] - ETA: 1:24 - loss: 0.4547 - regression_loss: 0.4193 - classification_loss: 0.0354 285/500 [================>.............] - ETA: 1:23 - loss: 0.4549 - regression_loss: 0.4195 - classification_loss: 0.0355 286/500 [================>.............] - ETA: 1:23 - loss: 0.4546 - regression_loss: 0.4191 - classification_loss: 0.0354 287/500 [================>.............] - ETA: 1:22 - loss: 0.4541 - regression_loss: 0.4187 - classification_loss: 0.0354 288/500 [================>.............] - ETA: 1:22 - loss: 0.4544 - regression_loss: 0.4190 - classification_loss: 0.0354 289/500 [================>.............] - ETA: 1:22 - loss: 0.4535 - regression_loss: 0.4182 - classification_loss: 0.0353 290/500 [================>.............] - ETA: 1:21 - loss: 0.4532 - regression_loss: 0.4180 - classification_loss: 0.0353 291/500 [================>.............] - ETA: 1:21 - loss: 0.4526 - regression_loss: 0.4174 - classification_loss: 0.0352 292/500 [================>.............] - ETA: 1:21 - loss: 0.4525 - regression_loss: 0.4174 - classification_loss: 0.0352 293/500 [================>.............] - ETA: 1:20 - loss: 0.4539 - regression_loss: 0.4187 - classification_loss: 0.0351 294/500 [================>.............] - ETA: 1:20 - loss: 0.4538 - regression_loss: 0.4187 - classification_loss: 0.0351 295/500 [================>.............] - ETA: 1:19 - loss: 0.4535 - regression_loss: 0.4184 - classification_loss: 0.0351 296/500 [================>.............] - ETA: 1:19 - loss: 0.4531 - regression_loss: 0.4181 - classification_loss: 0.0351 297/500 [================>.............] 
- ETA: 1:19 - loss: 0.4550 - regression_loss: 0.4193 - classification_loss: 0.0358 298/500 [================>.............] - ETA: 1:18 - loss: 0.4539 - regression_loss: 0.4182 - classification_loss: 0.0357 299/500 [================>.............] - ETA: 1:18 - loss: 0.4532 - regression_loss: 0.4176 - classification_loss: 0.0356 300/500 [=================>............] - ETA: 1:17 - loss: 0.4530 - regression_loss: 0.4174 - classification_loss: 0.0356 301/500 [=================>............] - ETA: 1:17 - loss: 0.4526 - regression_loss: 0.4170 - classification_loss: 0.0355 302/500 [=================>............] - ETA: 1:17 - loss: 0.4539 - regression_loss: 0.4183 - classification_loss: 0.0357 303/500 [=================>............] - ETA: 1:16 - loss: 0.4541 - regression_loss: 0.4185 - classification_loss: 0.0357 304/500 [=================>............] - ETA: 1:16 - loss: 0.4536 - regression_loss: 0.4180 - classification_loss: 0.0356 305/500 [=================>............] - ETA: 1:15 - loss: 0.4529 - regression_loss: 0.4174 - classification_loss: 0.0355 306/500 [=================>............] - ETA: 1:15 - loss: 0.4523 - regression_loss: 0.4168 - classification_loss: 0.0355 307/500 [=================>............] - ETA: 1:15 - loss: 0.4520 - regression_loss: 0.4166 - classification_loss: 0.0354 308/500 [=================>............] - ETA: 1:14 - loss: 0.4517 - regression_loss: 0.4163 - classification_loss: 0.0353 309/500 [=================>............] - ETA: 1:14 - loss: 0.4526 - regression_loss: 0.4171 - classification_loss: 0.0355 310/500 [=================>............] - ETA: 1:13 - loss: 0.4523 - regression_loss: 0.4169 - classification_loss: 0.0354 311/500 [=================>............] - ETA: 1:13 - loss: 0.4518 - regression_loss: 0.4164 - classification_loss: 0.0354 312/500 [=================>............] - ETA: 1:13 - loss: 0.4516 - regression_loss: 0.4163 - classification_loss: 0.0354 313/500 [=================>............] 
- ETA: 1:12 - loss: 0.4530 - regression_loss: 0.4174 - classification_loss: 0.0357 314/500 [=================>............] - ETA: 1:12 - loss: 0.4523 - regression_loss: 0.4167 - classification_loss: 0.0356 315/500 [=================>............] - ETA: 1:12 - loss: 0.4525 - regression_loss: 0.4169 - classification_loss: 0.0357 316/500 [=================>............] - ETA: 1:11 - loss: 0.4517 - regression_loss: 0.4161 - classification_loss: 0.0356 317/500 [==================>...........] - ETA: 1:11 - loss: 0.4511 - regression_loss: 0.4155 - classification_loss: 0.0356 318/500 [==================>...........] - ETA: 1:10 - loss: 0.4505 - regression_loss: 0.4150 - classification_loss: 0.0355 319/500 [==================>...........] - ETA: 1:10 - loss: 0.4502 - regression_loss: 0.4148 - classification_loss: 0.0355 320/500 [==================>...........] - ETA: 1:10 - loss: 0.4500 - regression_loss: 0.4146 - classification_loss: 0.0354 321/500 [==================>...........] - ETA: 1:09 - loss: 0.4497 - regression_loss: 0.4143 - classification_loss: 0.0354 322/500 [==================>...........] - ETA: 1:09 - loss: 0.4495 - regression_loss: 0.4142 - classification_loss: 0.0353 323/500 [==================>...........] - ETA: 1:08 - loss: 0.4492 - regression_loss: 0.4139 - classification_loss: 0.0353 324/500 [==================>...........] - ETA: 1:08 - loss: 0.4487 - regression_loss: 0.4135 - classification_loss: 0.0352 325/500 [==================>...........] - ETA: 1:08 - loss: 0.4477 - regression_loss: 0.4125 - classification_loss: 0.0352 326/500 [==================>...........] - ETA: 1:07 - loss: 0.4485 - regression_loss: 0.4133 - classification_loss: 0.0352 327/500 [==================>...........] - ETA: 1:07 - loss: 0.4484 - regression_loss: 0.4133 - classification_loss: 0.0352 328/500 [==================>...........] - ETA: 1:06 - loss: 0.4487 - regression_loss: 0.4136 - classification_loss: 0.0351 329/500 [==================>...........] 
- ETA: 1:06 - loss: 0.4483 - regression_loss: 0.4132 - classification_loss: 0.0351 330/500 [==================>...........] - ETA: 1:06 - loss: 0.4486 - regression_loss: 0.4135 - classification_loss: 0.0351 331/500 [==================>...........] - ETA: 1:05 - loss: 0.4483 - regression_loss: 0.4132 - classification_loss: 0.0351 332/500 [==================>...........] - ETA: 1:05 - loss: 0.4475 - regression_loss: 0.4125 - classification_loss: 0.0350 333/500 [==================>...........] - ETA: 1:05 - loss: 0.4473 - regression_loss: 0.4123 - classification_loss: 0.0350 334/500 [===================>..........] - ETA: 1:04 - loss: 0.4468 - regression_loss: 0.4119 - classification_loss: 0.0349 335/500 [===================>..........] - ETA: 1:04 - loss: 0.4472 - regression_loss: 0.4123 - classification_loss: 0.0349 336/500 [===================>..........] - ETA: 1:03 - loss: 0.4471 - regression_loss: 0.4122 - classification_loss: 0.0349 337/500 [===================>..........] - ETA: 1:03 - loss: 0.4469 - regression_loss: 0.4120 - classification_loss: 0.0349 338/500 [===================>..........] - ETA: 1:03 - loss: 0.4469 - regression_loss: 0.4120 - classification_loss: 0.0348 339/500 [===================>..........] - ETA: 1:02 - loss: 0.4464 - regression_loss: 0.4117 - classification_loss: 0.0348 340/500 [===================>..........] - ETA: 1:02 - loss: 0.4473 - regression_loss: 0.4125 - classification_loss: 0.0348 341/500 [===================>..........] - ETA: 1:01 - loss: 0.4468 - regression_loss: 0.4120 - classification_loss: 0.0347 342/500 [===================>..........] - ETA: 1:01 - loss: 0.4462 - regression_loss: 0.4116 - classification_loss: 0.0347 343/500 [===================>..........] - ETA: 1:01 - loss: 0.4474 - regression_loss: 0.4126 - classification_loss: 0.0348 344/500 [===================>..........] - ETA: 1:00 - loss: 0.4467 - regression_loss: 0.4120 - classification_loss: 0.0347 345/500 [===================>..........] 
- ETA: 1:00 - loss: 0.4467 - regression_loss: 0.4119 - classification_loss: 0.0347 346/500 [===================>..........] - ETA: 59s - loss: 0.4461 - regression_loss: 0.4114 - classification_loss: 0.0347
[per-batch progress lines for steps 347–499 trimmed: loss held steady in the 0.445–0.453 range (regression_loss ≈ 0.41, classification_loss ≈ 0.034) through the remainder of the epoch]
- ETA: 0s - loss: 0.4510 - regression_loss: 0.4170 - classification_loss: 0.0340 500/500 [==============================] - 195s 389ms/step - loss: 0.4505 - regression_loss: 0.4166 - classification_loss: 0.0340
326 instances of class plum with average precision: 0.8438
mAP: 0.8438
Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5
Now training depth backbone as well
Model: "retinanet"
[repeated model summary trimmed: same RGB-backbone layers (suffixed _rgb) as in the summary printed at startup, now listed together with the depth-branch layers (lambda_2, padding_conv1_d, conv1_d, pool1_d, res2a_branch2a_d, ...)]
(BatchNormal (None, None, None, 2 1024 res4b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2a_relu_d (Activati (None, None, None, 6 0 res2a_branch2a_d[0][0] __________________________________________________________________________________________________ res4b2_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding2a_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ padding4b2_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b2_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b2_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_relu_d (Activati (None, None, None, 6 0 res2a_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b2_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2c_d (Conv2D) (None, None, None, 2 16384 res2a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res2a_branch1_d (Conv2D) (None, None, None, 2 16384 pool1_d[0][0] __________________________________________________________________________________________________ res4b2_branch2b_relu_rgb (Activ (None, 
None, None, 2 0 bn4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_d (Add) (None, None, None, 2 0 res2a_branch2c_d[0][0] res2a_branch1_d[0][0] __________________________________________________________________________________________________ res4b2_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b2_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_relu_d (Activation) (None, None, None, 2 0 res2a_d[0][0] __________________________________________________________________________________________________ bn4b2_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b2_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_d (Conv2D) (None, None, None, 6 16384 res2a_relu_d[0][0] __________________________________________________________________________________________________ res4b2_rgb (Add) (None, None, None, 1 0 bn4b2_branch2c_rgb[0][0] res4b1_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_relu_d (Activati (None, None, None, 6 0 res2b_branch2a_d[0][0] __________________________________________________________________________________________________ res4b2_relu_rgb (Activation) (None, None, None, 1 0 res4b2_rgb[0][0] __________________________________________________________________________________________________ padding2b_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2b_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b2_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2b_branch2b_d[0][0] 
__________________________________________________________________________________________________ bn4b3_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_relu_d (Activati (None, None, None, 6 0 res2b_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2c_d (Conv2D) (None, None, None, 2 16384 res2b_branch2b_relu_d[0][0] __________________________________________________________________________________________________ padding4b3_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b3_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_d (Add) (None, None, None, 2 0 res2b_branch2c_d[0][0] res2a_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2b_relu_d (Activation) (None, None, None, 2 0 res2b_d[0][0] __________________________________________________________________________________________________ bn4b3_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2a_d (Conv2D) (None, None, None, 6 16384 res2b_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b3_branch2b_rgb[0][0] 
__________________________________________________________________________________________________ res2c_branch2a_relu_d (Activati (None, None, None, 6 0 res2c_branch2a_d[0][0] __________________________________________________________________________________________________ res4b3_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b3_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding2c_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2c_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b3_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b3_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2c_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_rgb (Add) (None, None, None, 1 0 bn4b3_branch2c_rgb[0][0] res4b2_relu_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_relu_d (Activati (None, None, None, 6 0 res2c_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_relu_rgb (Activation) (None, None, None, 1 0 res4b3_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2c_d (Conv2D) (None, None, None, 2 16384 res2c_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b4_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b3_relu_rgb[0][0] __________________________________________________________________________________________________ res2c_d (Add) (None, None, None, 2 0 res2c_branch2c_d[0][0] res2b_relu_d[0][0] 
__________________________________________________________________________________________________ bn4b4_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b4_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2c_relu_d (Activation) (None, None, None, 2 0 res2c_d[0][0] __________________________________________________________________________________________________ res4b4_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b4_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_d (Conv2D) (None, None, None, 1 32768 res2c_relu_d[0][0] __________________________________________________________________________________________________ padding4b4_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b4_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_relu_d (Activati (None, None, None, 1 0 res3a_branch2a_d[0][0] __________________________________________________________________________________________________ res4b4_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b4_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding3a_branch2b_d (ZeroPaddi (None, None, None, 1 0 res3a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b4_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b4_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b4_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b4_branch2b_rgb[0][0] 
__________________________________________________________________________________________________ res3a_branch2b_relu_d (Activati (None, None, None, 1 0 res3a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b4_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b4_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2c_d (Conv2D) (None, None, None, 5 65536 res3a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res3a_branch1_d (Conv2D) (None, None, None, 5 131072 res2c_relu_d[0][0] __________________________________________________________________________________________________ bn4b4_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b4_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3a_d (Add) (None, None, None, 5 0 res3a_branch2c_d[0][0] res3a_branch1_d[0][0] __________________________________________________________________________________________________ res4b4_rgb (Add) (None, None, None, 1 0 bn4b4_branch2c_rgb[0][0] res4b3_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_relu_d (Activation) (None, None, None, 5 0 res3a_d[0][0] __________________________________________________________________________________________________ res4b4_relu_rgb (Activation) (None, None, None, 1 0 res4b4_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2a_d (Conv2D) (None, None, None, 1 65536 res3a_relu_d[0][0] __________________________________________________________________________________________________ res4b5_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b4_relu_rgb[0][0] 
__________________________________________________________________________________________________ res3b1_branch2a_relu_d (Activat (None, None, None, 1 0 res3b1_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b5_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b5_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding3b1_branch2b_d (ZeroPadd (None, None, None, 1 0 res3b1_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b5_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b5_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3b1_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b5_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b5_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2b_relu_d (Activat (None, None, None, 1 0 res3b1_branch2b_d[0][0] __________________________________________________________________________________________________ res4b5_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b5_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b1_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b5_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b5_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_d (Add) (None, None, None, 5 0 res3b1_branch2c_d[0][0] res3a_relu_d[0][0] 
__________________________________________________________________________________________________ res4b5_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b5_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_relu_d (Activation) (None, None, None, 5 0 res3b1_d[0][0] __________________________________________________________________________________________________ res4b5_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b5_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2a_d (Conv2D) (None, None, None, 1 65536 res3b1_relu_d[0][0] __________________________________________________________________________________________________ bn4b5_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b5_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2a_relu_d (Activat (None, None, None, 1 0 res3b2_branch2a_d[0][0] __________________________________________________________________________________________________ res4b5_rgb (Add) (None, None, None, 1 0 bn4b5_branch2c_rgb[0][0] res4b4_relu_rgb[0][0] __________________________________________________________________________________________________ padding3b2_branch2b_d (ZeroPadd (None, None, None, 1 0 res3b2_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b5_relu_rgb (Activation) (None, None, None, 1 0 res4b5_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3b2_branch2b_d[0][0] __________________________________________________________________________________________________ res4b6_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b5_relu_rgb[0][0] 
__________________________________________________________________________________________________ res3b2_branch2b_relu_d (Activat (None, None, None, 1 0 res3b2_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b6_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b2_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b6_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b6_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b2_d (Add) (None, None, None, 5 0 res3b2_branch2c_d[0][0] res3b1_relu_d[0][0] __________________________________________________________________________________________________ padding4b6_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b6_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_relu_d (Activation) (None, None, None, 5 0 res3b2_d[0][0] __________________________________________________________________________________________________ res4b6_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2a_d (Conv2D) (None, None, None, 1 65536 res3b2_relu_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2a_relu_d (Activat (None, None, None, 1 0 res3b3_branch2a_d[0][0] 
__________________________________________________________________________________________________ res4b6_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding3b3_branch2b_d (ZeroPadd (None, None, None, 1 0 res3b3_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b6_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b6_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3b3_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b6_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_relu_d (Activat (None, None, None, 1 0 res3b3_branch2b_d[0][0] __________________________________________________________________________________________________ res4b6_rgb (Add) (None, None, None, 1 0 bn4b6_branch2c_rgb[0][0] res4b5_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b3_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b6_relu_rgb (Activation) (None, None, None, 1 0 res4b6_rgb[0][0] __________________________________________________________________________________________________ res3b3_d (Add) (None, None, None, 5 0 res3b3_branch2c_d[0][0] res3b2_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b6_relu_rgb[0][0] 
__________________________________________________________________________________________________ res3b3_relu_d (Activation) (None, None, None, 5 0 res3b3_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b7_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_d (Conv2D) (None, None, None, 2 131072 res3b3_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b7_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_relu_d (Activati (None, None, None, 2 0 res4a_branch2a_d[0][0] __________________________________________________________________________________________________ padding4b7_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b7_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ padding4a_branch2b_d (ZeroPaddi (None, None, None, 2 0 res4a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4a_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2b_relu_d (Activati (None, None, None, 2 0 res4a_branch2b_d[0][0] 
__________________________________________________________________________________________________ res4b7_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2c_d (Conv2D) (None, None, None, 1 262144 res4a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4a_branch1_d (Conv2D) (None, None, None, 1 524288 res3b3_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b7_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4a_d (Add) (None, None, None, 1 0 res4a_branch2c_d[0][0] res4a_branch1_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b7_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4a_relu_d (Activation) (None, None, None, 1 0 res4a_d[0][0] __________________________________________________________________________________________________ res4b7_rgb (Add) (None, None, None, 1 0 bn4b7_branch2c_rgb[0][0] res4b6_relu_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2a_d (Conv2D) (None, None, None, 2 262144 res4a_relu_d[0][0] __________________________________________________________________________________________________ res4b7_relu_rgb (Activation) (None, None, None, 1 0 res4b7_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2a_relu_d (Activat (None, None, None, 2 0 res4b1_branch2a_d[0][0] 
__________________________________________________________________________________________________ res4b8_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b7_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b1_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b1_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b8_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b8_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b1_branch2b_d[0][0] __________________________________________________________________________________________________ res4b8_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b8_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2b_relu_d (Activat (None, None, None, 2 0 res4b1_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b8_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b8_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b1_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b8_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b8_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b1_d (Add) (None, None, None, 1 0 res4b1_branch2c_d[0][0] res4a_relu_d[0][0] __________________________________________________________________________________________________ bn4b8_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b8_branch2b_rgb[0][0] 
__________________________________________________________________________________________________
[… summary continues with the stage-4 residual blocks res4b1 through res4b18, built twice: once
for the RGB stream (`_rgb` suffix, with BatchNormalization layers) and once for the depth stream
(`_d` suffix, without BatchNormalization). The two streams' rows are interleaved in the original
printout; every block repeats the same bottleneck pattern, shown here once per stream.]

RGB bottleneck block (representative, res4b9):
__________________________________________________________________________________________________
res4b9_branch2a_rgb (Conv2D)       (None, None, None, 256)    262144  res4b8_relu_rgb[0][0]
bn4b9_branch2a_rgb (BatchNorm)     (None, None, None, 256)      1024  res4b9_branch2a_rgb[0][0]
res4b9_branch2a_relu_rgb (Activ)   (None, None, None, 256)         0  bn4b9_branch2a_rgb[0][0]
padding4b9_branch2b_rgb (ZeroPad)  (None, None, None, 256)         0  res4b9_branch2a_relu_rgb[0][0]
res4b9_branch2b_rgb (Conv2D)       (None, None, None, 256)    589824  padding4b9_branch2b_rgb[0][0]
bn4b9_branch2b_rgb (BatchNorm)     (None, None, None, 256)      1024  res4b9_branch2b_rgb[0][0]
res4b9_branch2b_relu_rgb (Activ)   (None, None, None, 256)         0  bn4b9_branch2b_rgb[0][0]
res4b9_branch2c_rgb (Conv2D)       (None, None, None, 1024)   262144  res4b9_branch2b_relu_rgb[0][0]
bn4b9_branch2c_rgb (BatchNorm)     (None, None, None, 1024)     4096  res4b9_branch2c_rgb[0][0]
res4b9_rgb (Add)                   (None, None, None, 1024)        0  bn4b9_branch2c_rgb[0][0]
                                                                      res4b8_relu_rgb[0][0]
res4b9_relu_rgb (Activation)       (None, None, None, 1024)        0  res4b9_rgb[0][0]
__________________________________________________________________________________________________

Depth bottleneck block (representative, res4b3):
__________________________________________________________________________________________________
res4b3_branch2a_d (Conv2D)         (None, None, None, 256)    262144  res4b2_relu_d[0][0]
res4b3_branch2a_relu_d (Activ)     (None, None, None, 256)         0  res4b3_branch2a_d[0][0]
padding4b3_branch2b_d (ZeroPad)    (None, None, None, 256)         0  res4b3_branch2a_relu_d[0][0]
res4b3_branch2b_d (Conv2D)         (None, None, None, 256)    589824  padding4b3_branch2b_d[0][0]
res4b3_branch2b_relu_d (Activ)     (None, None, None, 256)         0  res4b3_branch2b_d[0][0]
res4b3_branch2c_d (Conv2D)         (None, None, None, 1024)   262144  res4b3_branch2b_relu_d[0][0]
res4b3_d (Add)                     (None, None, None, 1024)        0  res4b3_branch2c_d[0][0]
                                                                      res4b2_relu_d[0][0]
res4b3_relu_d (Activation)         (None, None, None, 1024)        0  res4b3_d[0][0]
__________________________________________________________________________________________________
[The same pattern repeats through res4b18 in both streams.]
res4b16_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b18_relu_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b16_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b19_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2b_relu_d (Activa (None, None, None, 2 0 res4b16_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b19_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b16_branch2b_relu_d[0][0] __________________________________________________________________________________________________ padding4b19_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b19_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b16_d (Add) (None, None, None, 1 0 res4b16_branch2c_d[0][0] res4b15_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b16_relu_d (Activation) (None, None, None, 1 0 res4b16_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2b_rgb (BatchNorma (None, None, None, 2 
1024 res4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b16_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2a_relu_d (Activa (None, None, None, 2 0 res4b17_branch2a_d[0][0] __________________________________________________________________________________________________ res4b19_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b17_branch2b_d (ZeroPad (None, None, None, 2 0 res4b17_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b19_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b17_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_rgb (Add) (None, None, None, 1 0 bn4b19_branch2c_rgb[0][0] res4b18_relu_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2b_relu_d (Activa (None, None, None, 2 0 res4b17_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_relu_rgb (Activation) (None, None, None, 1 0 res4b19_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2c_d (Conv2D) (None, None, None, 1 
262144 res4b17_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b20_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b19_relu_rgb[0][0] __________________________________________________________________________________________________ res4b17_d (Add) (None, None, None, 1 0 res4b17_branch2c_d[0][0] res4b16_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b20_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b17_relu_d (Activation) (None, None, None, 1 0 res4b17_d[0][0] __________________________________________________________________________________________________ res4b20_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b20_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b17_relu_d[0][0] __________________________________________________________________________________________________ padding4b20_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b20_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2a_relu_d (Activa (None, None, None, 2 0 res4b18_branch2a_d[0][0] __________________________________________________________________________________________________ res4b20_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b18_branch2b_d (ZeroPad (None, None, None, 2 0 res4b18_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2b_rgb (BatchNorma (None, None, None, 2 
1024 res4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b18_branch2b_d[0][0] __________________________________________________________________________________________________ res4b20_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2b_relu_d (Activa (None, None, None, 2 0 res4b18_branch2b_d[0][0] __________________________________________________________________________________________________ res4b20_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b18_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b20_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b18_d (Add) (None, None, None, 1 0 res4b18_branch2c_d[0][0] res4b17_relu_d[0][0] __________________________________________________________________________________________________ res4b20_rgb (Add) (None, None, None, 1 0 bn4b20_branch2c_rgb[0][0] res4b19_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_relu_d (Activation) (None, None, None, 1 0 res4b18_d[0][0] __________________________________________________________________________________________________ res4b20_relu_rgb (Activation) (None, None, None, 1 0 res4b20_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2a_d (Conv2D) (None, None, None, 2 262144 
res4b18_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b20_relu_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu_d (Activa (None, None, None, 2 0 res4b19_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b21_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b21_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding4b19_branch2b_d (ZeroPad (None, None, None, 2 0 res4b19_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b21_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b21_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b21_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu_d (Activa (None, None, None, 2 0 res4b19_branch2b_d[0][0] __________________________________________________________________________________________________ res4b21_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b21_branch2b_rgb (BatchNorma (None, None, None, 
2 1024 res4b21_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b19_d (Add) (None, None, None, 1 0 res4b19_branch2c_d[0][0] res4b18_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b21_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b19_relu_d (Activation) (None, None, None, 1 0 res4b19_d[0][0] __________________________________________________________________________________________________ res4b21_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b19_relu_d[0][0] __________________________________________________________________________________________________ bn4b21_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b21_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2a_relu_d (Activa (None, None, None, 2 0 res4b20_branch2a_d[0][0] __________________________________________________________________________________________________ res4b21_rgb (Add) (None, None, None, 1 0 bn4b21_branch2c_rgb[0][0] res4b20_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b20_branch2b_d (ZeroPad (None, None, None, 2 0 res4b20_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b21_relu_rgb (Activation) (None, None, None, 1 0 res4b21_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2b_d (Conv2D) (None, None, None, 2 589824 
padding4b20_branch2b_d[0][0] __________________________________________________________________________________________________ res4b22_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b21_relu_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2b_relu_d (Activa (None, None, None, 2 0 res4b20_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b22_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b22_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b22_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b22_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b20_d (Add) (None, None, None, 1 0 res4b20_branch2c_d[0][0] res4b19_relu_d[0][0] __________________________________________________________________________________________________ padding4b22_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b22_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b20_relu_d (Activation) (None, None, None, 1 0 res4b20_d[0][0] __________________________________________________________________________________________________ res4b22_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b20_relu_d[0][0] __________________________________________________________________________________________________ bn4b22_branch2b_rgb (BatchNorma (None, None, None, 2 1024 
res4b22_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2a_relu_d (Activa (None, None, None, 2 0 res4b21_branch2a_d[0][0] __________________________________________________________________________________________________ res4b22_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b22_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b21_branch2b_d (ZeroPad (None, None, None, 2 0 res4b21_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b22_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b22_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b22_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2b_relu_d (Activa (None, None, None, 2 0 res4b21_branch2b_d[0][0] __________________________________________________________________________________________________ res4b22_rgb (Add) (None, None, None, 1 0 bn4b22_branch2c_rgb[0][0] res4b21_relu_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b22_relu_rgb (Activation) (None, None, None, 1 0 res4b22_rgb[0][0] __________________________________________________________________________________________________ res4b21_d (Add) (None, None, None, 1 0 
res4b21_branch2c_d[0][0] res4b20_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2a_rgb (Conv2D) (None, None, None, 5 524288 res4b22_relu_rgb[0][0] __________________________________________________________________________________________________ res4b21_relu_d (Activation) (None, None, None, 1 0 res4b21_d[0][0] __________________________________________________________________________________________________ bn5a_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b21_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2a_relu_d (Activa (None, None, None, 2 0 res4b22_branch2a_d[0][0] __________________________________________________________________________________________________ padding5a_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5a_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b22_branch2b_d (ZeroPad (None, None, None, 2 0 res4b22_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b_d[0][0] __________________________________________________________________________________________________ bn5a_branch2b_rgb (BatchNormali (None, None, None, 5 2048 
res5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2b_relu_d (Activa (None, None, None, 2 0 res4b22_branch2b_d[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5a_branch1_rgb (Conv2D) (None, None, None, 2 2097152 res4b22_relu_rgb[0][0] __________________________________________________________________________________________________ res4b22_d (Add) (None, None, None, 1 0 res4b22_branch2c_d[0][0] res4b21_relu_d[0][0] __________________________________________________________________________________________________ bn5a_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5a_branch2c_rgb[0][0] __________________________________________________________________________________________________ bn5a_branch1_rgb (BatchNormaliz (None, None, None, 2 8192 res5a_branch1_rgb[0][0] __________________________________________________________________________________________________ res4b22_relu_d (Activation) (None, None, None, 1 0 res4b22_d[0][0] __________________________________________________________________________________________________ res5a_rgb (Add) (None, None, None, 2 0 bn5a_branch2c_rgb[0][0] bn5a_branch1_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2a_d (Conv2D) (None, None, None, 5 524288 
res4b22_relu_d[0][0] __________________________________________________________________________________________________ res5a_relu_rgb (Activation) (None, None, None, 2 0 res5a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu_d (Activati (None, None, None, 5 0 res5a_branch2a_d[0][0] __________________________________________________________________________________________________ res5b_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5a_relu_rgb[0][0] __________________________________________________________________________________________________ padding5a_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn5b_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_d[0][0] __________________________________________________________________________________________________ res5b_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu_d (Activati (None, None, None, 5 0 res5a_branch2b_d[0][0] __________________________________________________________________________________________________ padding5b_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5b_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch1_d (Conv2D) (None, None, None, 2 2097152 res4b22_relu_d[0][0] 
__________________________________________________________________________________________________ res5b_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5a_d (Add) (None, None, None, 2 0 res5a_branch2c_d[0][0] res5a_branch1_d[0][0] __________________________________________________________________________________________________ bn5b_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5a_relu_d (Activation) (None, None, None, 2 0 res5a_d[0][0] __________________________________________________________________________________________________ res5b_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5a_relu_d[0][0] __________________________________________________________________________________________________ res5b_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2a_relu_d (Activati (None, None, None, 5 0 res5b_branch2a_d[0][0] __________________________________________________________________________________________________ bn5b_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5b_branch2c_rgb[0][0] __________________________________________________________________________________________________ padding5b_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5b_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5b_rgb (Add) (None, None, None, 2 0 bn5b_branch2c_rgb[0][0] res5a_relu_rgb[0][0] 
__________________________________________________________________________________________________ res5b_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_d[0][0] __________________________________________________________________________________________________ res5b_relu_rgb (Activation) (None, None, None, 2 0 res5b_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2b_relu_d (Activati (None, None, None, 5 0 res5b_branch2b_d[0][0] __________________________________________________________________________________________________ res5c_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5b_relu_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn5c_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5b_d (Add) (None, None, None, 2 0 res5b_branch2c_d[0][0] res5a_relu_d[0][0] __________________________________________________________________________________________________ res5c_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5b_relu_d (Activation) (None, None, None, 2 0 res5b_d[0][0] __________________________________________________________________________________________________ padding5c_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5c_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5b_relu_d[0][0] 
__________________________________________________________________________________________________ res5c_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2a_relu_d (Activati (None, None, None, 5 0 res5c_branch2a_d[0][0] __________________________________________________________________________________________________ bn5c_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding5c_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5c_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5c_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_d[0][0] __________________________________________________________________________________________________ res5c_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2b_relu_d (Activati (None, None, None, 5 0 res5c_branch2b_d[0][0] __________________________________________________________________________________________________ bn5c_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5c_branch2c_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res5c_rgb (Add) (None, None, None, 2 0 bn5c_branch2c_rgb[0][0] res5b_relu_rgb[0][0] 
__________________________________________________________________________________________________
res5c_d (Add)                   (None, None, None, 2 0           res5c_branch2c_d[0][0]
                                                                 res5b_relu_d[0][0]
__________________________________________________________________________________________________
res5c_relu_rgb (Activation)     (None, None, None, 2 0           res5c_rgb[0][0]
__________________________________________________________________________________________________
res5c_relu_d (Activation)       (None, None, None, 2 0           res5c_d[0][0]
__________________________________________________________________________________________________
concatenate_33 (Concatenate)    (None, None, None, 4 0           res5c_relu_rgb[0][0]
                                                                 res5c_relu_d[0][0]
__________________________________________________________________________________________________
C5_reduced (Conv2D)             (None, None, None, 2 1048832     concatenate_33[0][0]
__________________________________________________________________________________________________
concatenate_30 (Concatenate)    (None, None, None, 2 0           res4b22_relu_rgb[0][0]
                                                                 res4b22_relu_d[0][0]
__________________________________________________________________________________________________
P5_upsampled (UpsampleLike)     (None, None, None, 2 0           C5_reduced[0][0]
                                                                 concatenate_30[0][0]
__________________________________________________________________________________________________
C4_reduced (Conv2D)             (None, None, None, 2 524544      concatenate_30[0][0]
__________________________________________________________________________________________________
P4_merged (Add)                 (None, None, None, 2 0           P5_upsampled[0][0]
                                                                 C4_reduced[0][0]
__________________________________________________________________________________________________
concatenate_7 (Concatenate)     (None, None, None, 1 0           res3b3_relu_rgb[0][0]
                                                                 res3b3_relu_d[0][0]
__________________________________________________________________________________________________
P4_upsampled (UpsampleLike)     (None, None, None, 2 0           P4_merged[0][0]
                                                                 concatenate_7[0][0]
__________________________________________________________________________________________________
C3_reduced (Conv2D)             (None, None, None, 2 262400      concatenate_7[0][0]
__________________________________________________________________________________________________
P6 (Conv2D)                     (None, None, None, 2 9437440     concatenate_33[0][0]
__________________________________________________________________________________________________
P3_merged (Add)                 (None, None, None, 2 0           P4_upsampled[0][0]
                                                                 C3_reduced[0][0]
__________________________________________________________________________________________________
C6_relu (Activation)            (None, None, None, 2 0           P6[0][0]
__________________________________________________________________________________________________
P3 (Conv2D)                     (None, None, None, 2 590080      P3_merged[0][0]
__________________________________________________________________________________________________
P4 (Conv2D)                     (None, None, None, 2 590080      P4_merged[0][0]
__________________________________________________________________________________________________
P5 (Conv2D)                     (None, None, None, 2 590080      C5_reduced[0][0]
__________________________________________________________________________________________________
P7 (Conv2D)                     (None, None, None, 2 590080      C6_relu[0][0]
__________________________________________________________________________________________________
regression_submodel (Model)     (None, None, 4)      2443300     P3[0][0]
                                                                 P4[0][0]
                                                                 P5[0][0]
                                                                 P6[0][0]
                                                                 P7[0][0]
__________________________________________________________________________________________________
classification_submodel (Model) (None, None, 1)      2381065     P3[0][0]
                                                                 P4[0][0]
                                                                 P5[0][0]
                                                                 P6[0][0]
                                                                 P7[0][0]
__________________________________________________________________________________________________
regression (Concatenate)        (None, None, 4)      0           regression_submodel[1][0]
                                                                 regression_submodel[2][0]
                                                                 regression_submodel[3][0]
                                                                 regression_submodel[4][0]
                                                                 regression_submodel[5][0]
__________________________________________________________________________________________________
classification (Concatenate)    (None, None, 1)      0           classification_submodel[1][0]
                                                                 classification_submodel[2][0]
                                                                 classification_submodel[3][0]
                                                                 classification_submodel[4][0]
                                                                 classification_submodel[5][0]
==================================================================================================
Total params: 103,451,949
Trainable params: 103,241,261
Non-trainable params: 210,688
__________________________________________________________________________________________________
None
Epoch 1/150
  1/500 [..............................] - ETA: 1:35:45 - loss: 0.5251 - regression_loss: 0.4954 - classification_loss: 0.0297
  2/500 [..............................] - ETA: 49:56 - loss: 0.7995 - regression_loss: 0.7177 - classification_loss: 0.0818
  3/500 [..............................] - ETA: 34:39 - loss: 0.7023 - regression_loss: 0.6392 - classification_loss: 0.0631
  4/500 [..............................] - ETA: 27:00 - loss: 0.7151 - regression_loss: 0.6527 - classification_loss: 0.0624
  5/500 [..............................] - ETA: 22:25 - loss: 0.6749 - regression_loss: 0.6220 - classification_loss: 0.0529
  6/500 [..............................] - ETA: 19:20 - loss: 0.6260 - regression_loss: 0.5775 - classification_loss: 0.0485
  7/500 [..............................] - ETA: 17:09 - loss: 0.5706 - regression_loss: 0.5263 - classification_loss: 0.0444
  8/500 [..............................] - ETA: 15:30 - loss: 0.5569 - regression_loss: 0.5123 - classification_loss: 0.0446
  9/500 [..............................] - ETA: 14:13 - loss: 0.5148 - regression_loss: 0.4742 - classification_loss: 0.0406
 10/500 [..............................] - ETA: 13:11 - loss: 0.5103 - regression_loss: 0.4686 - classification_loss: 0.0417
 11/500 [..............................] - ETA: 12:19 - loss: 0.5213 - regression_loss: 0.4799 - classification_loss: 0.0414
 12/500 [..............................]
- ETA: 11:37 - loss: 0.5261 - regression_loss: 0.4839 - classification_loss: 0.0422 13/500 [..............................] - ETA: 11:01 - loss: 0.5080 - regression_loss: 0.4674 - classification_loss: 0.0405 14/500 [..............................] - ETA: 10:30 - loss: 0.4999 - regression_loss: 0.4599 - classification_loss: 0.0400 15/500 [..............................] - ETA: 10:03 - loss: 0.4825 - regression_loss: 0.4443 - classification_loss: 0.0382 16/500 [..............................] - ETA: 9:40 - loss: 0.4683 - regression_loss: 0.4309 - classification_loss: 0.0374  17/500 [>.............................] - ETA: 9:20 - loss: 0.4761 - regression_loss: 0.4367 - classification_loss: 0.0395 18/500 [>.............................] - ETA: 9:01 - loss: 0.4847 - regression_loss: 0.4451 - classification_loss: 0.0396 19/500 [>.............................] - ETA: 8:45 - loss: 0.4762 - regression_loss: 0.4374 - classification_loss: 0.0388 20/500 [>.............................] - ETA: 8:30 - loss: 0.4708 - regression_loss: 0.4330 - classification_loss: 0.0377 21/500 [>.............................] - ETA: 8:16 - loss: 0.4668 - regression_loss: 0.4296 - classification_loss: 0.0372 22/500 [>.............................] - ETA: 8:04 - loss: 0.4577 - regression_loss: 0.4214 - classification_loss: 0.0363 23/500 [>.............................] - ETA: 7:53 - loss: 0.4855 - regression_loss: 0.4402 - classification_loss: 0.0453 24/500 [>.............................] - ETA: 7:42 - loss: 0.4813 - regression_loss: 0.4367 - classification_loss: 0.0447 25/500 [>.............................] - ETA: 7:32 - loss: 0.4807 - regression_loss: 0.4367 - classification_loss: 0.0440 26/500 [>.............................] - ETA: 7:23 - loss: 0.4709 - regression_loss: 0.4280 - classification_loss: 0.0429 27/500 [>.............................] - ETA: 7:15 - loss: 0.4693 - regression_loss: 0.4268 - classification_loss: 0.0425 28/500 [>.............................] 
- ETA: 7:07 - loss: 0.4714 - regression_loss: 0.4285 - classification_loss: 0.0429 29/500 [>.............................] - ETA: 7:00 - loss: 0.4661 - regression_loss: 0.4238 - classification_loss: 0.0424 30/500 [>.............................] - ETA: 6:53 - loss: 0.4540 - regression_loss: 0.4127 - classification_loss: 0.0412 31/500 [>.............................] - ETA: 6:47 - loss: 0.4455 - regression_loss: 0.4049 - classification_loss: 0.0406 32/500 [>.............................] - ETA: 6:40 - loss: 0.4418 - regression_loss: 0.4011 - classification_loss: 0.0407 33/500 [>.............................] - ETA: 6:35 - loss: 0.4593 - regression_loss: 0.4165 - classification_loss: 0.0428 34/500 [=>............................] - ETA: 6:29 - loss: 0.4554 - regression_loss: 0.4130 - classification_loss: 0.0424 35/500 [=>............................] - ETA: 6:24 - loss: 0.4493 - regression_loss: 0.4074 - classification_loss: 0.0418 36/500 [=>............................] - ETA: 6:20 - loss: 0.4435 - regression_loss: 0.4024 - classification_loss: 0.0411 37/500 [=>............................] - ETA: 6:15 - loss: 0.4480 - regression_loss: 0.4067 - classification_loss: 0.0414 38/500 [=>............................] - ETA: 6:11 - loss: 0.4460 - regression_loss: 0.4053 - classification_loss: 0.0407 39/500 [=>............................] - ETA: 6:06 - loss: 0.4483 - regression_loss: 0.4072 - classification_loss: 0.0411 40/500 [=>............................] - ETA: 6:02 - loss: 0.4497 - regression_loss: 0.4086 - classification_loss: 0.0411 41/500 [=>............................] - ETA: 5:59 - loss: 0.4511 - regression_loss: 0.4100 - classification_loss: 0.0411 42/500 [=>............................] - ETA: 5:55 - loss: 0.4483 - regression_loss: 0.4078 - classification_loss: 0.0405 43/500 [=>............................] - ETA: 5:51 - loss: 0.4418 - regression_loss: 0.4017 - classification_loss: 0.0400 44/500 [=>............................] 
- ETA: 5:48 - loss: 0.4380 - regression_loss: 0.3986 - classification_loss: 0.0394 45/500 [=>............................] - ETA: 5:44 - loss: 0.4388 - regression_loss: 0.3995 - classification_loss: 0.0393 46/500 [=>............................] - ETA: 5:41 - loss: 0.4368 - regression_loss: 0.3979 - classification_loss: 0.0389 47/500 [=>............................] - ETA: 5:38 - loss: 0.4313 - regression_loss: 0.3928 - classification_loss: 0.0384 48/500 [=>............................] - ETA: 5:35 - loss: 0.4283 - regression_loss: 0.3904 - classification_loss: 0.0379 49/500 [=>............................] - ETA: 5:32 - loss: 0.4281 - regression_loss: 0.3903 - classification_loss: 0.0378 50/500 [==>...........................] - ETA: 5:29 - loss: 0.4285 - regression_loss: 0.3911 - classification_loss: 0.0375 51/500 [==>...........................] - ETA: 5:26 - loss: 0.4397 - regression_loss: 0.4009 - classification_loss: 0.0387 52/500 [==>...........................] - ETA: 5:23 - loss: 0.4331 - regression_loss: 0.3949 - classification_loss: 0.0382 53/500 [==>...........................] - ETA: 5:21 - loss: 0.4335 - regression_loss: 0.3956 - classification_loss: 0.0379 54/500 [==>...........................] - ETA: 5:18 - loss: 0.4321 - regression_loss: 0.3946 - classification_loss: 0.0375 55/500 [==>...........................] - ETA: 5:15 - loss: 0.4324 - regression_loss: 0.3950 - classification_loss: 0.0374 56/500 [==>...........................] - ETA: 5:13 - loss: 0.4383 - regression_loss: 0.4004 - classification_loss: 0.0379 57/500 [==>...........................] - ETA: 5:11 - loss: 0.4345 - regression_loss: 0.3970 - classification_loss: 0.0375 58/500 [==>...........................] - ETA: 5:08 - loss: 0.4339 - regression_loss: 0.3965 - classification_loss: 0.0374 59/500 [==>...........................] - ETA: 5:06 - loss: 0.4313 - regression_loss: 0.3942 - classification_loss: 0.0371 60/500 [==>...........................] 
- ETA: 5:04 - loss: 0.4382 - regression_loss: 0.4009 - classification_loss: 0.0374 61/500 [==>...........................] - ETA: 5:02 - loss: 0.4366 - regression_loss: 0.3995 - classification_loss: 0.0370 62/500 [==>...........................] - ETA: 5:00 - loss: 0.4351 - regression_loss: 0.3985 - classification_loss: 0.0366 63/500 [==>...........................] - ETA: 4:58 - loss: 0.4305 - regression_loss: 0.3943 - classification_loss: 0.0362 64/500 [==>...........................] - ETA: 4:56 - loss: 0.4293 - regression_loss: 0.3933 - classification_loss: 0.0360 65/500 [==>...........................] - ETA: 4:54 - loss: 0.4309 - regression_loss: 0.3950 - classification_loss: 0.0359 66/500 [==>...........................] - ETA: 4:53 - loss: 0.4297 - regression_loss: 0.3940 - classification_loss: 0.0357 67/500 [===>..........................] - ETA: 4:51 - loss: 0.4263 - regression_loss: 0.3908 - classification_loss: 0.0354 68/500 [===>..........................] - ETA: 4:49 - loss: 0.4243 - regression_loss: 0.3891 - classification_loss: 0.0352 69/500 [===>..........................] - ETA: 4:47 - loss: 0.4236 - regression_loss: 0.3886 - classification_loss: 0.0350 70/500 [===>..........................] - ETA: 4:46 - loss: 0.4228 - regression_loss: 0.3881 - classification_loss: 0.0347 71/500 [===>..........................] - ETA: 4:44 - loss: 0.4211 - regression_loss: 0.3866 - classification_loss: 0.0345 72/500 [===>..........................] - ETA: 4:43 - loss: 0.4205 - regression_loss: 0.3861 - classification_loss: 0.0343 73/500 [===>..........................] - ETA: 4:41 - loss: 0.4195 - regression_loss: 0.3854 - classification_loss: 0.0341 74/500 [===>..........................] - ETA: 4:40 - loss: 0.4185 - regression_loss: 0.3846 - classification_loss: 0.0339 75/500 [===>..........................] - ETA: 4:38 - loss: 0.4170 - regression_loss: 0.3830 - classification_loss: 0.0340 76/500 [===>..........................] 
- ETA: 4:37 - loss: 0.4160 - regression_loss: 0.3823 - classification_loss: 0.0337 77/500 [===>..........................] - ETA: 4:35 - loss: 0.4196 - regression_loss: 0.3858 - classification_loss: 0.0338 78/500 [===>..........................] - ETA: 4:34 - loss: 0.4181 - regression_loss: 0.3843 - classification_loss: 0.0337 79/500 [===>..........................] - ETA: 4:32 - loss: 0.4170 - regression_loss: 0.3834 - classification_loss: 0.0336 80/500 [===>..........................] - ETA: 4:31 - loss: 0.4172 - regression_loss: 0.3837 - classification_loss: 0.0335 81/500 [===>..........................] - ETA: 4:29 - loss: 0.4145 - regression_loss: 0.3813 - classification_loss: 0.0333 82/500 [===>..........................] - ETA: 4:28 - loss: 0.4133 - regression_loss: 0.3802 - classification_loss: 0.0331 83/500 [===>..........................] - ETA: 4:27 - loss: 0.4145 - regression_loss: 0.3815 - classification_loss: 0.0330 84/500 [====>.........................] - ETA: 4:25 - loss: 0.4123 - regression_loss: 0.3795 - classification_loss: 0.0327 85/500 [====>.........................] - ETA: 4:24 - loss: 0.4110 - regression_loss: 0.3785 - classification_loss: 0.0325 86/500 [====>.........................] - ETA: 4:23 - loss: 0.4097 - regression_loss: 0.3774 - classification_loss: 0.0323 87/500 [====>.........................] - ETA: 4:22 - loss: 0.4072 - regression_loss: 0.3751 - classification_loss: 0.0321 88/500 [====>.........................] - ETA: 4:20 - loss: 0.4077 - regression_loss: 0.3756 - classification_loss: 0.0321 89/500 [====>.........................] - ETA: 4:19 - loss: 0.4054 - regression_loss: 0.3735 - classification_loss: 0.0319 90/500 [====>.........................] - ETA: 4:18 - loss: 0.4032 - regression_loss: 0.3714 - classification_loss: 0.0318 91/500 [====>.........................] - ETA: 4:17 - loss: 0.4058 - regression_loss: 0.3737 - classification_loss: 0.0321 92/500 [====>.........................] 
- ETA: 4:16 - loss: 0.4039 - regression_loss: 0.3719 - classification_loss: 0.0319 93/500 [====>.........................] - ETA: 4:15 - loss: 0.4026 - regression_loss: 0.3707 - classification_loss: 0.0319 94/500 [====>.........................] - ETA: 4:14 - loss: 0.4017 - regression_loss: 0.3699 - classification_loss: 0.0318 95/500 [====>.........................] - ETA: 4:13 - loss: 0.4008 - regression_loss: 0.3692 - classification_loss: 0.0316 96/500 [====>.........................] - ETA: 4:11 - loss: 0.3988 - regression_loss: 0.3672 - classification_loss: 0.0315 97/500 [====>.........................] - ETA: 4:10 - loss: 0.3972 - regression_loss: 0.3657 - classification_loss: 0.0315 98/500 [====>.........................] - ETA: 4:09 - loss: 0.3963 - regression_loss: 0.3651 - classification_loss: 0.0313 99/500 [====>.........................] - ETA: 4:08 - loss: 0.3947 - regression_loss: 0.3636 - classification_loss: 0.0311 100/500 [=====>........................] - ETA: 4:07 - loss: 0.3932 - regression_loss: 0.3623 - classification_loss: 0.0310 101/500 [=====>........................] - ETA: 4:06 - loss: 0.3951 - regression_loss: 0.3640 - classification_loss: 0.0311 102/500 [=====>........................] - ETA: 4:05 - loss: 0.3970 - regression_loss: 0.3658 - classification_loss: 0.0311 103/500 [=====>........................] - ETA: 4:04 - loss: 0.3948 - regression_loss: 0.3638 - classification_loss: 0.0310 104/500 [=====>........................] - ETA: 4:03 - loss: 0.3940 - regression_loss: 0.3632 - classification_loss: 0.0308 105/500 [=====>........................] - ETA: 4:02 - loss: 0.3918 - regression_loss: 0.3611 - classification_loss: 0.0307 106/500 [=====>........................] - ETA: 4:01 - loss: 0.3925 - regression_loss: 0.3618 - classification_loss: 0.0307 107/500 [=====>........................] - ETA: 4:00 - loss: 0.3903 - regression_loss: 0.3598 - classification_loss: 0.0305 108/500 [=====>........................] 
- ETA: 3:59 - loss: 0.3927 - regression_loss: 0.3614 - classification_loss: 0.0313 109/500 [=====>........................] - ETA: 3:58 - loss: 0.3919 - regression_loss: 0.3608 - classification_loss: 0.0311 110/500 [=====>........................] - ETA: 3:57 - loss: 0.3968 - regression_loss: 0.3654 - classification_loss: 0.0313 111/500 [=====>........................] - ETA: 3:56 - loss: 0.3972 - regression_loss: 0.3659 - classification_loss: 0.0313 112/500 [=====>........................] - ETA: 3:55 - loss: 0.3958 - regression_loss: 0.3646 - classification_loss: 0.0312 113/500 [=====>........................] - ETA: 3:54 - loss: 0.3941 - regression_loss: 0.3631 - classification_loss: 0.0310 114/500 [=====>........................] - ETA: 3:53 - loss: 0.3939 - regression_loss: 0.3631 - classification_loss: 0.0309 115/500 [=====>........................] - ETA: 3:53 - loss: 0.3923 - regression_loss: 0.3616 - classification_loss: 0.0307 116/500 [=====>........................] - ETA: 3:52 - loss: 0.3907 - regression_loss: 0.3602 - classification_loss: 0.0305 117/500 [======>.......................] - ETA: 3:51 - loss: 0.3901 - regression_loss: 0.3596 - classification_loss: 0.0305 118/500 [======>.......................] - ETA: 3:50 - loss: 0.3895 - regression_loss: 0.3590 - classification_loss: 0.0306 119/500 [======>.......................] - ETA: 3:49 - loss: 0.3888 - regression_loss: 0.3583 - classification_loss: 0.0305 120/500 [======>.......................] - ETA: 3:48 - loss: 0.3895 - regression_loss: 0.3591 - classification_loss: 0.0304 121/500 [======>.......................] - ETA: 3:47 - loss: 0.3876 - regression_loss: 0.3573 - classification_loss: 0.0303 122/500 [======>.......................] - ETA: 3:47 - loss: 0.3901 - regression_loss: 0.3597 - classification_loss: 0.0304 123/500 [======>.......................] - ETA: 3:46 - loss: 0.3901 - regression_loss: 0.3597 - classification_loss: 0.0304 124/500 [======>.......................] 
- ETA: 3:45 - loss: 0.3889 - regression_loss: 0.3586 - classification_loss: 0.0303 125/500 [======>.......................] - ETA: 3:44 - loss: 0.3882 - regression_loss: 0.3581 - classification_loss: 0.0301 126/500 [======>.......................] - ETA: 3:43 - loss: 0.3864 - regression_loss: 0.3565 - classification_loss: 0.0300 127/500 [======>.......................] - ETA: 3:42 - loss: 0.3857 - regression_loss: 0.3559 - classification_loss: 0.0299 128/500 [======>.......................] - ETA: 3:41 - loss: 0.3834 - regression_loss: 0.3537 - classification_loss: 0.0297 129/500 [======>.......................] - ETA: 3:40 - loss: 0.3848 - regression_loss: 0.3550 - classification_loss: 0.0298 130/500 [======>.......................] - ETA: 3:40 - loss: 0.3852 - regression_loss: 0.3551 - classification_loss: 0.0300 131/500 [======>.......................] - ETA: 3:39 - loss: 0.3847 - regression_loss: 0.3548 - classification_loss: 0.0299 132/500 [======>.......................] - ETA: 3:38 - loss: 0.3845 - regression_loss: 0.3547 - classification_loss: 0.0298 133/500 [======>.......................] - ETA: 3:37 - loss: 0.3842 - regression_loss: 0.3545 - classification_loss: 0.0297 134/500 [=======>......................] - ETA: 3:36 - loss: 0.3833 - regression_loss: 0.3538 - classification_loss: 0.0295 135/500 [=======>......................] - ETA: 3:35 - loss: 0.3835 - regression_loss: 0.3540 - classification_loss: 0.0295 136/500 [=======>......................] - ETA: 3:35 - loss: 0.3827 - regression_loss: 0.3534 - classification_loss: 0.0293 137/500 [=======>......................] - ETA: 3:34 - loss: 0.3819 - regression_loss: 0.3526 - classification_loss: 0.0293 138/500 [=======>......................] - ETA: 3:33 - loss: 0.3835 - regression_loss: 0.3542 - classification_loss: 0.0293 139/500 [=======>......................] - ETA: 3:32 - loss: 0.3838 - regression_loss: 0.3544 - classification_loss: 0.0294 140/500 [=======>......................] 
- ETA: 3:31 - loss: 0.3828 - regression_loss: 0.3535 - classification_loss: 0.0293 141/500 [=======>......................] - ETA: 3:31 - loss: 0.3818 - regression_loss: 0.3526 - classification_loss: 0.0292 142/500 [=======>......................] - ETA: 3:30 - loss: 0.3816 - regression_loss: 0.3524 - classification_loss: 0.0291 143/500 [=======>......................] - ETA: 3:29 - loss: 0.3822 - regression_loss: 0.3530 - classification_loss: 0.0291 144/500 [=======>......................] - ETA: 3:28 - loss: 0.3841 - regression_loss: 0.3550 - classification_loss: 0.0291 145/500 [=======>......................] - ETA: 3:28 - loss: 0.3833 - regression_loss: 0.3543 - classification_loss: 0.0290 146/500 [=======>......................] - ETA: 3:27 - loss: 0.3821 - regression_loss: 0.3532 - classification_loss: 0.0289 147/500 [=======>......................] - ETA: 3:26 - loss: 0.3819 - regression_loss: 0.3530 - classification_loss: 0.0289 148/500 [=======>......................] - ETA: 3:25 - loss: 0.3820 - regression_loss: 0.3531 - classification_loss: 0.0288 149/500 [=======>......................] - ETA: 3:25 - loss: 0.3812 - regression_loss: 0.3524 - classification_loss: 0.0288 150/500 [========>.....................] - ETA: 3:24 - loss: 0.3806 - regression_loss: 0.3518 - classification_loss: 0.0287 151/500 [========>.....................] - ETA: 3:23 - loss: 0.3808 - regression_loss: 0.3520 - classification_loss: 0.0288 152/500 [========>.....................] - ETA: 3:22 - loss: 0.3806 - regression_loss: 0.3518 - classification_loss: 0.0288 153/500 [========>.....................] - ETA: 3:22 - loss: 0.3809 - regression_loss: 0.3520 - classification_loss: 0.0289 154/500 [========>.....................] - ETA: 3:21 - loss: 0.3807 - regression_loss: 0.3518 - classification_loss: 0.0289 155/500 [========>.....................] - ETA: 3:20 - loss: 0.3803 - regression_loss: 0.3514 - classification_loss: 0.0288 156/500 [========>.....................] 
- ETA: 3:19 - loss: 0.3807 - regression_loss: 0.3519 - classification_loss: 0.0289 157/500 [========>.....................] - ETA: 3:19 - loss: 0.3809 - regression_loss: 0.3521 - classification_loss: 0.0289 158/500 [========>.....................] - ETA: 3:18 - loss: 0.3809 - regression_loss: 0.3521 - classification_loss: 0.0289 159/500 [========>.....................] - ETA: 3:17 - loss: 0.3810 - regression_loss: 0.3522 - classification_loss: 0.0288 160/500 [========>.....................] - ETA: 3:17 - loss: 0.3807 - regression_loss: 0.3518 - classification_loss: 0.0288 161/500 [========>.....................] - ETA: 3:16 - loss: 0.3805 - regression_loss: 0.3516 - classification_loss: 0.0289 162/500 [========>.....................] - ETA: 3:15 - loss: 0.3796 - regression_loss: 0.3508 - classification_loss: 0.0288 163/500 [========>.....................] - ETA: 3:14 - loss: 0.3792 - regression_loss: 0.3505 - classification_loss: 0.0287 164/500 [========>.....................] - ETA: 3:14 - loss: 0.3788 - regression_loss: 0.3502 - classification_loss: 0.0287 165/500 [========>.....................] - ETA: 3:13 - loss: 0.3778 - regression_loss: 0.3493 - classification_loss: 0.0285 166/500 [========>.....................] - ETA: 3:12 - loss: 0.3774 - regression_loss: 0.3489 - classification_loss: 0.0285 167/500 [=========>....................] - ETA: 3:12 - loss: 0.3788 - regression_loss: 0.3503 - classification_loss: 0.0285 168/500 [=========>....................] - ETA: 3:11 - loss: 0.3814 - regression_loss: 0.3526 - classification_loss: 0.0288 169/500 [=========>....................] - ETA: 3:10 - loss: 0.3810 - regression_loss: 0.3522 - classification_loss: 0.0288 170/500 [=========>....................] - ETA: 3:09 - loss: 0.3797 - regression_loss: 0.3510 - classification_loss: 0.0286 171/500 [=========>....................] - ETA: 3:09 - loss: 0.3810 - regression_loss: 0.3523 - classification_loss: 0.0287 172/500 [=========>....................] 
- ETA: 3:08 - loss: 0.3821 - regression_loss: 0.3533 - classification_loss: 0.0288 173/500 [=========>....................] - ETA: 3:07 - loss: 0.3819 - regression_loss: 0.3532 - classification_loss: 0.0287 174/500 [=========>....................] - ETA: 3:07 - loss: 0.3817 - regression_loss: 0.3529 - classification_loss: 0.0287 175/500 [=========>....................] - ETA: 3:06 - loss: 0.3823 - regression_loss: 0.3535 - classification_loss: 0.0288 176/500 [=========>....................] - ETA: 3:05 - loss: 0.3810 - regression_loss: 0.3524 - classification_loss: 0.0287 177/500 [=========>....................] - ETA: 3:04 - loss: 0.3834 - regression_loss: 0.3543 - classification_loss: 0.0291 178/500 [=========>....................] - ETA: 3:04 - loss: 0.3830 - regression_loss: 0.3539 - classification_loss: 0.0290 179/500 [=========>....................] - ETA: 3:03 - loss: 0.3832 - regression_loss: 0.3542 - classification_loss: 0.0290 180/500 [=========>....................] - ETA: 3:02 - loss: 0.3825 - regression_loss: 0.3536 - classification_loss: 0.0289 181/500 [=========>....................] - ETA: 3:02 - loss: 0.3825 - regression_loss: 0.3537 - classification_loss: 0.0288 182/500 [=========>....................] - ETA: 3:01 - loss: 0.3813 - regression_loss: 0.3526 - classification_loss: 0.0287 183/500 [=========>....................] - ETA: 3:00 - loss: 0.3806 - regression_loss: 0.3520 - classification_loss: 0.0286 184/500 [==========>...................] - ETA: 3:00 - loss: 0.3813 - regression_loss: 0.3527 - classification_loss: 0.0286 185/500 [==========>...................] - ETA: 2:59 - loss: 0.3801 - regression_loss: 0.3515 - classification_loss: 0.0286 186/500 [==========>...................] - ETA: 2:58 - loss: 0.3814 - regression_loss: 0.3527 - classification_loss: 0.0287 187/500 [==========>...................] - ETA: 2:58 - loss: 0.3839 - regression_loss: 0.3548 - classification_loss: 0.0290 188/500 [==========>...................] 
- ETA: 2:57 - loss: 0.3836 - regression_loss: 0.3546 - classification_loss: 0.0290 189/500 [==========>...................] - ETA: 2:56 - loss: 0.3834 - regression_loss: 0.3544 - classification_loss: 0.0290 190/500 [==========>...................] - ETA: 2:56 - loss: 0.3867 - regression_loss: 0.3566 - classification_loss: 0.0301 191/500 [==========>...................] - ETA: 2:55 - loss: 0.3871 - regression_loss: 0.3571 - classification_loss: 0.0301 192/500 [==========>...................] - ETA: 2:54 - loss: 0.3864 - regression_loss: 0.3564 - classification_loss: 0.0300 193/500 [==========>...................] - ETA: 2:54 - loss: 0.3865 - regression_loss: 0.3566 - classification_loss: 0.0300 194/500 [==========>...................] - ETA: 2:53 - loss: 0.3866 - regression_loss: 0.3567 - classification_loss: 0.0299 195/500 [==========>...................] - ETA: 2:52 - loss: 0.3858 - regression_loss: 0.3559 - classification_loss: 0.0299 196/500 [==========>...................] - ETA: 2:52 - loss: 0.3855 - regression_loss: 0.3557 - classification_loss: 0.0298 197/500 [==========>...................] - ETA: 2:51 - loss: 0.3847 - regression_loss: 0.3550 - classification_loss: 0.0298 198/500 [==========>...................] - ETA: 2:50 - loss: 0.3843 - regression_loss: 0.3545 - classification_loss: 0.0298 199/500 [==========>...................] - ETA: 2:50 - loss: 0.3835 - regression_loss: 0.3538 - classification_loss: 0.0297 200/500 [===========>..................] - ETA: 2:49 - loss: 0.3825 - regression_loss: 0.3528 - classification_loss: 0.0296 201/500 [===========>..................] - ETA: 2:48 - loss: 0.3819 - regression_loss: 0.3523 - classification_loss: 0.0296 202/500 [===========>..................] - ETA: 2:48 - loss: 0.3840 - regression_loss: 0.3539 - classification_loss: 0.0301 203/500 [===========>..................] - ETA: 2:47 - loss: 0.3835 - regression_loss: 0.3535 - classification_loss: 0.0301 204/500 [===========>..................] 
- ETA: 2:47 - loss: 0.3829 - regression_loss: 0.3529 - classification_loss: 0.0300 205/500 [===========>..................] - ETA: 2:46 - loss: 0.3847 - regression_loss: 0.3545 - classification_loss: 0.0301 206/500 [===========>..................] - ETA: 2:45 - loss: 0.3839 - regression_loss: 0.3538 - classification_loss: 0.0301 207/500 [===========>..................] - ETA: 2:45 - loss: 0.3841 - regression_loss: 0.3540 - classification_loss: 0.0301 208/500 [===========>..................] - ETA: 2:44 - loss: 0.3837 - regression_loss: 0.3536 - classification_loss: 0.0300 209/500 [===========>..................] - ETA: 2:43 - loss: 0.3832 - regression_loss: 0.3532 - classification_loss: 0.0300 210/500 [===========>..................] - ETA: 2:43 - loss: 0.3829 - regression_loss: 0.3530 - classification_loss: 0.0299 211/500 [===========>..................] - ETA: 2:42 - loss: 0.3817 - regression_loss: 0.3519 - classification_loss: 0.0298 212/500 [===========>..................] - ETA: 2:41 - loss: 0.3809 - regression_loss: 0.3511 - classification_loss: 0.0298 213/500 [===========>..................] - ETA: 2:41 - loss: 0.3803 - regression_loss: 0.3506 - classification_loss: 0.0297 214/500 [===========>..................] - ETA: 2:40 - loss: 0.3794 - regression_loss: 0.3497 - classification_loss: 0.0296 215/500 [===========>..................] - ETA: 2:40 - loss: 0.3800 - regression_loss: 0.3503 - classification_loss: 0.0297 216/500 [===========>..................] - ETA: 2:39 - loss: 0.3809 - regression_loss: 0.3512 - classification_loss: 0.0297 217/500 [============>.................] - ETA: 2:38 - loss: 0.3806 - regression_loss: 0.3509 - classification_loss: 0.0296 218/500 [============>.................] - ETA: 2:38 - loss: 0.3804 - regression_loss: 0.3508 - classification_loss: 0.0296 219/500 [============>.................] - ETA: 2:37 - loss: 0.3801 - regression_loss: 0.3506 - classification_loss: 0.0295 220/500 [============>.................] 
- ETA: 2:36 - loss: 0.3794 - regression_loss: 0.3500 - classification_loss: 0.0295 221/500 [============>.................] - ETA: 2:36 - loss: 0.3792 - regression_loss: 0.3498 - classification_loss: 0.0294 222/500 [============>.................] - ETA: 2:35 - loss: 0.3785 - regression_loss: 0.3492 - classification_loss: 0.0293 223/500 [============>.................] - ETA: 2:35 - loss: 0.3794 - regression_loss: 0.3500 - classification_loss: 0.0293 224/500 [============>.................] - ETA: 2:34 - loss: 0.3786 - regression_loss: 0.3494 - classification_loss: 0.0292 225/500 [============>.................] - ETA: 2:33 - loss: 0.3782 - regression_loss: 0.3490 - classification_loss: 0.0292 226/500 [============>.................] - ETA: 2:33 - loss: 0.3777 - regression_loss: 0.3486 - classification_loss: 0.0291 227/500 [============>.................] - ETA: 2:32 - loss: 0.3780 - regression_loss: 0.3487 - classification_loss: 0.0293 228/500 [============>.................] - ETA: 2:31 - loss: 0.3768 - regression_loss: 0.3476 - classification_loss: 0.0292 229/500 [============>.................] - ETA: 2:31 - loss: 0.3767 - regression_loss: 0.3475 - classification_loss: 0.0292 230/500 [============>.................] - ETA: 2:30 - loss: 0.3759 - regression_loss: 0.3468 - classification_loss: 0.0292 231/500 [============>.................] - ETA: 2:30 - loss: 0.3763 - regression_loss: 0.3472 - classification_loss: 0.0291 232/500 [============>.................] - ETA: 2:29 - loss: 0.3770 - regression_loss: 0.3480 - classification_loss: 0.0291 233/500 [============>.................] - ETA: 2:28 - loss: 0.3762 - regression_loss: 0.3472 - classification_loss: 0.0290 234/500 [=============>................] - ETA: 2:28 - loss: 0.3786 - regression_loss: 0.3493 - classification_loss: 0.0293 235/500 [=============>................] - ETA: 2:27 - loss: 0.3784 - regression_loss: 0.3491 - classification_loss: 0.0293 236/500 [=============>................] 
- ETA: 2:27 - loss: 0.3781 - regression_loss: 0.3489 - classification_loss: 0.0292 237/500 [=============>................] - ETA: 2:26 - loss: 0.3783 - regression_loss: 0.3490 - classification_loss: 0.0293 238/500 [=============>................] - ETA: 2:25 - loss: 0.3783 - regression_loss: 0.3490 - classification_loss: 0.0293 239/500 [=============>................] - ETA: 2:25 - loss: 0.3783 - regression_loss: 0.3490 - classification_loss: 0.0292 240/500 [=============>................] - ETA: 2:24 - loss: 0.3775 - regression_loss: 0.3483 - classification_loss: 0.0291 241/500 [=============>................] - ETA: 2:23 - loss: 0.3776 - regression_loss: 0.3484 - classification_loss: 0.0291 242/500 [=============>................] - ETA: 2:23 - loss: 0.3773 - regression_loss: 0.3482 - classification_loss: 0.0291 243/500 [=============>................] - ETA: 2:22 - loss: 0.3764 - regression_loss: 0.3474 - classification_loss: 0.0290 244/500 [=============>................] - ETA: 2:22 - loss: 0.3757 - regression_loss: 0.3467 - classification_loss: 0.0290 245/500 [=============>................] - ETA: 2:21 - loss: 0.3746 - regression_loss: 0.3457 - classification_loss: 0.0289 246/500 [=============>................] - ETA: 2:20 - loss: 0.3745 - regression_loss: 0.3456 - classification_loss: 0.0289 247/500 [=============>................] - ETA: 2:20 - loss: 0.3740 - regression_loss: 0.3452 - classification_loss: 0.0288 248/500 [=============>................] - ETA: 2:19 - loss: 0.3748 - regression_loss: 0.3459 - classification_loss: 0.0289 249/500 [=============>................] - ETA: 2:19 - loss: 0.3747 - regression_loss: 0.3458 - classification_loss: 0.0289 250/500 [==============>...............] - ETA: 2:18 - loss: 0.3740 - regression_loss: 0.3451 - classification_loss: 0.0289 251/500 [==============>...............] - ETA: 2:17 - loss: 0.3744 - regression_loss: 0.3455 - classification_loss: 0.0288 252/500 [==============>...............] 
- ETA: 2:17 - loss: 0.3740 - regression_loss: 0.3452 - classification_loss: 0.0288 253/500 [==============>...............] - ETA: 2:16 - loss: 0.3731 - regression_loss: 0.3444 - classification_loss: 0.0287 254/500 [==============>...............] - ETA: 2:16 - loss: 0.3738 - regression_loss: 0.3451 - classification_loss: 0.0287 255/500 [==============>...............] - ETA: 2:15 - loss: 0.3734 - regression_loss: 0.3447 - classification_loss: 0.0287 256/500 [==============>...............] - ETA: 2:14 - loss: 0.3727 - regression_loss: 0.3441 - classification_loss: 0.0286 257/500 [==============>...............] - ETA: 2:14 - loss: 0.3727 - regression_loss: 0.3441 - classification_loss: 0.0286 258/500 [==============>...............] - ETA: 2:13 - loss: 0.3743 - regression_loss: 0.3456 - classification_loss: 0.0287 259/500 [==============>...............] - ETA: 2:13 - loss: 0.3737 - regression_loss: 0.3450 - classification_loss: 0.0286 260/500 [==============>...............] - ETA: 2:12 - loss: 0.3730 - regression_loss: 0.3444 - classification_loss: 0.0286 261/500 [==============>...............] - ETA: 2:12 - loss: 0.3725 - regression_loss: 0.3440 - classification_loss: 0.0285 262/500 [==============>...............] - ETA: 2:11 - loss: 0.3729 - regression_loss: 0.3443 - classification_loss: 0.0286 263/500 [==============>...............] - ETA: 2:10 - loss: 0.3731 - regression_loss: 0.3445 - classification_loss: 0.0286 264/500 [==============>...............] - ETA: 2:10 - loss: 0.3730 - regression_loss: 0.3443 - classification_loss: 0.0286 265/500 [==============>...............] - ETA: 2:09 - loss: 0.3729 - regression_loss: 0.3443 - classification_loss: 0.0286 266/500 [==============>...............] - ETA: 2:09 - loss: 0.3725 - regression_loss: 0.3440 - classification_loss: 0.0285 267/500 [===============>..............] - ETA: 2:08 - loss: 0.3722 - regression_loss: 0.3438 - classification_loss: 0.0285 268/500 [===============>..............] 
- ETA: 2:07 - loss: 0.3724 - regression_loss: 0.3439 - classification_loss: 0.0285 269/500 [===============>..............] - ETA: 2:07 - loss: 0.3726 - regression_loss: 0.3440 - classification_loss: 0.0286 270/500 [===============>..............] - ETA: 2:06 - loss: 0.3731 - regression_loss: 0.3444 - classification_loss: 0.0286 271/500 [===============>..............] - ETA: 2:06 - loss: 0.3728 - regression_loss: 0.3441 - classification_loss: 0.0287 272/500 [===============>..............] - ETA: 2:05 - loss: 0.3723 - regression_loss: 0.3437 - classification_loss: 0.0286 273/500 [===============>..............] - ETA: 2:05 - loss: 0.3715 - regression_loss: 0.3429 - classification_loss: 0.0285 274/500 [===============>..............] - ETA: 2:04 - loss: 0.3709 - regression_loss: 0.3424 - classification_loss: 0.0285 275/500 [===============>..............] - ETA: 2:03 - loss: 0.3700 - regression_loss: 0.3416 - classification_loss: 0.0284 276/500 [===============>..............] - ETA: 2:03 - loss: 0.3696 - regression_loss: 0.3413 - classification_loss: 0.0284 277/500 [===============>..............] - ETA: 2:02 - loss: 0.3691 - regression_loss: 0.3408 - classification_loss: 0.0283 278/500 [===============>..............] - ETA: 2:02 - loss: 0.3690 - regression_loss: 0.3407 - classification_loss: 0.0283 279/500 [===============>..............] - ETA: 2:01 - loss: 0.3689 - regression_loss: 0.3407 - classification_loss: 0.0283 280/500 [===============>..............] - ETA: 2:00 - loss: 0.3682 - regression_loss: 0.3400 - classification_loss: 0.0282 281/500 [===============>..............] - ETA: 2:00 - loss: 0.3678 - regression_loss: 0.3396 - classification_loss: 0.0282 282/500 [===============>..............] - ETA: 1:59 - loss: 0.3677 - regression_loss: 0.3395 - classification_loss: 0.0282 283/500 [===============>..............] - ETA: 1:59 - loss: 0.3683 - regression_loss: 0.3401 - classification_loss: 0.0282 284/500 [================>.............] 
- ETA: 1:58 - loss: 0.3682 - regression_loss: 0.3400 - classification_loss: 0.0282 285/500 [================>.............] - ETA: 1:58 - loss: 0.3681 - regression_loss: 0.3399 - classification_loss: 0.0281 286/500 [================>.............] - ETA: 1:57 - loss: 0.3674 - regression_loss: 0.3393 - classification_loss: 0.0281 287/500 [================>.............] - ETA: 1:56 - loss: 0.3678 - regression_loss: 0.3397 - classification_loss: 0.0281 288/500 [================>.............] - ETA: 1:56 - loss: 0.3679 - regression_loss: 0.3398 - classification_loss: 0.0281 289/500 [================>.............] - ETA: 1:55 - loss: 0.3680 - regression_loss: 0.3399 - classification_loss: 0.0281 290/500 [================>.............] - ETA: 1:55 - loss: 0.3678 - regression_loss: 0.3398 - classification_loss: 0.0280 291/500 [================>.............] - ETA: 1:54 - loss: 0.3673 - regression_loss: 0.3393 - classification_loss: 0.0280 292/500 [================>.............] - ETA: 1:54 - loss: 0.3669 - regression_loss: 0.3390 - classification_loss: 0.0279 293/500 [================>.............] - ETA: 1:53 - loss: 0.3661 - regression_loss: 0.3383 - classification_loss: 0.0278 294/500 [================>.............] - ETA: 1:52 - loss: 0.3663 - regression_loss: 0.3384 - classification_loss: 0.0279 295/500 [================>.............] - ETA: 1:52 - loss: 0.3657 - regression_loss: 0.3378 - classification_loss: 0.0278 296/500 [================>.............] - ETA: 1:51 - loss: 0.3667 - regression_loss: 0.3387 - classification_loss: 0.0280 297/500 [================>.............] - ETA: 1:51 - loss: 0.3665 - regression_loss: 0.3386 - classification_loss: 0.0280 298/500 [================>.............] - ETA: 1:50 - loss: 0.3665 - regression_loss: 0.3386 - classification_loss: 0.0279 299/500 [================>.............] - ETA: 1:50 - loss: 0.3676 - regression_loss: 0.3396 - classification_loss: 0.0279 300/500 [=================>............] 
- ETA: 1:49 - loss: 0.3676 - regression_loss: 0.3397 - classification_loss: 0.0279 301/500 [=================>............] - ETA: 1:48 - loss: 0.3676 - regression_loss: 0.3397 - classification_loss: 0.0279 302/500 [=================>............] - ETA: 1:48 - loss: 0.3682 - regression_loss: 0.3402 - classification_loss: 0.0280 303/500 [=================>............] - ETA: 1:47 - loss: 0.3688 - regression_loss: 0.3407 - classification_loss: 0.0280 304/500 [=================>............] - ETA: 1:47 - loss: 0.3688 - regression_loss: 0.3408 - classification_loss: 0.0280 305/500 [=================>............] - ETA: 1:46 - loss: 0.3694 - regression_loss: 0.3413 - classification_loss: 0.0281 306/500 [=================>............] - ETA: 1:46 - loss: 0.3695 - regression_loss: 0.3414 - classification_loss: 0.0281 307/500 [=================>............] - ETA: 1:45 - loss: 0.3692 - regression_loss: 0.3412 - classification_loss: 0.0281 308/500 [=================>............] - ETA: 1:44 - loss: 0.3688 - regression_loss: 0.3408 - classification_loss: 0.0280 309/500 [=================>............] - ETA: 1:44 - loss: 0.3689 - regression_loss: 0.3409 - classification_loss: 0.0280 310/500 [=================>............] - ETA: 1:43 - loss: 0.3692 - regression_loss: 0.3412 - classification_loss: 0.0280 311/500 [=================>............] - ETA: 1:43 - loss: 0.3690 - regression_loss: 0.3410 - classification_loss: 0.0280 312/500 [=================>............] - ETA: 1:42 - loss: 0.3689 - regression_loss: 0.3409 - classification_loss: 0.0280 313/500 [=================>............] - ETA: 1:42 - loss: 0.3686 - regression_loss: 0.3407 - classification_loss: 0.0279 314/500 [=================>............] - ETA: 1:41 - loss: 0.3678 - regression_loss: 0.3399 - classification_loss: 0.0279 315/500 [=================>............] - ETA: 1:40 - loss: 0.3678 - regression_loss: 0.3399 - classification_loss: 0.0278 316/500 [=================>............] 
- ETA: 1:40 - loss: 0.3675 - regression_loss: 0.3398 - classification_loss: 0.0278 317/500 [==================>...........] - ETA: 1:39 - loss: 0.3686 - regression_loss: 0.3408 - classification_loss: 0.0278 318/500 [==================>...........] - ETA: 1:39 - loss: 0.3683 - regression_loss: 0.3405 - classification_loss: 0.0278 319/500 [==================>...........] - ETA: 1:38 - loss: 0.3682 - regression_loss: 0.3405 - classification_loss: 0.0278 320/500 [==================>...........] - ETA: 1:38 - loss: 0.3680 - regression_loss: 0.3402 - classification_loss: 0.0278 321/500 [==================>...........] - ETA: 1:37 - loss: 0.3675 - regression_loss: 0.3398 - classification_loss: 0.0277 322/500 [==================>...........] - ETA: 1:37 - loss: 0.3680 - regression_loss: 0.3401 - classification_loss: 0.0278 323/500 [==================>...........] - ETA: 1:36 - loss: 0.3682 - regression_loss: 0.3404 - classification_loss: 0.0278 324/500 [==================>...........] - ETA: 1:35 - loss: 0.3676 - regression_loss: 0.3398 - classification_loss: 0.0278 325/500 [==================>...........] - ETA: 1:35 - loss: 0.3676 - regression_loss: 0.3398 - classification_loss: 0.0278 326/500 [==================>...........] - ETA: 1:34 - loss: 0.3676 - regression_loss: 0.3398 - classification_loss: 0.0277 327/500 [==================>...........] - ETA: 1:34 - loss: 0.3669 - regression_loss: 0.3392 - classification_loss: 0.0277 328/500 [==================>...........] - ETA: 1:33 - loss: 0.3667 - regression_loss: 0.3390 - classification_loss: 0.0277 329/500 [==================>...........] - ETA: 1:33 - loss: 0.3666 - regression_loss: 0.3390 - classification_loss: 0.0277 330/500 [==================>...........] - ETA: 1:32 - loss: 0.3664 - regression_loss: 0.3388 - classification_loss: 0.0276 331/500 [==================>...........] - ETA: 1:31 - loss: 0.3663 - regression_loss: 0.3387 - classification_loss: 0.0276 332/500 [==================>...........] 
- ETA: 1:31 - loss: 0.3668 - regression_loss: 0.3391 - classification_loss: 0.0276 333/500 [==================>...........] - ETA: 1:30 - loss: 0.3662 - regression_loss: 0.3386 - classification_loss: 0.0276 334/500 [===================>..........] - ETA: 1:30 - loss: 0.3658 - regression_loss: 0.3383 - classification_loss: 0.0276 335/500 [===================>..........] - ETA: 1:29 - loss: 0.3669 - regression_loss: 0.3393 - classification_loss: 0.0276 336/500 [===================>..........] - ETA: 1:29 - loss: 0.3661 - regression_loss: 0.3386 - classification_loss: 0.0276 337/500 [===================>..........] - ETA: 1:28 - loss: 0.3657 - regression_loss: 0.3382 - classification_loss: 0.0275 338/500 [===================>..........] - ETA: 1:27 - loss: 0.3657 - regression_loss: 0.3382 - classification_loss: 0.0275 339/500 [===================>..........] - ETA: 1:27 - loss: 0.3659 - regression_loss: 0.3384 - classification_loss: 0.0276 340/500 [===================>..........] - ETA: 1:26 - loss: 0.3653 - regression_loss: 0.3378 - classification_loss: 0.0275 341/500 [===================>..........] - ETA: 1:26 - loss: 0.3647 - regression_loss: 0.3372 - classification_loss: 0.0275 342/500 [===================>..........] - ETA: 1:25 - loss: 0.3649 - regression_loss: 0.3375 - classification_loss: 0.0274 343/500 [===================>..........] - ETA: 1:25 - loss: 0.3650 - regression_loss: 0.3375 - classification_loss: 0.0274 344/500 [===================>..........] - ETA: 1:24 - loss: 0.3645 - regression_loss: 0.3372 - classification_loss: 0.0274 345/500 [===================>..........] - ETA: 1:24 - loss: 0.3645 - regression_loss: 0.3372 - classification_loss: 0.0274 346/500 [===================>..........] - ETA: 1:23 - loss: 0.3651 - regression_loss: 0.3376 - classification_loss: 0.0274 347/500 [===================>..........] - ETA: 1:22 - loss: 0.3653 - regression_loss: 0.3379 - classification_loss: 0.0274 348/500 [===================>..........] 
- ETA: 1:22 - loss: 0.3649 - regression_loss: 0.3375 - classification_loss: 0.0274 349/500 [===================>..........] - ETA: 1:21 - loss: 0.3650 - regression_loss: 0.3376 - classification_loss: 0.0274 350/500 [====================>.........] - ETA: 1:21 - loss: 0.3645 - regression_loss: 0.3371 - classification_loss: 0.0274 351/500 [====================>.........] - ETA: 1:20 - loss: 0.3643 - regression_loss: 0.3370 - classification_loss: 0.0273 352/500 [====================>.........] - ETA: 1:20 - loss: 0.3640 - regression_loss: 0.3367 - classification_loss: 0.0273 353/500 [====================>.........] - ETA: 1:19 - loss: 0.3640 - regression_loss: 0.3367 - classification_loss: 0.0273 354/500 [====================>.........] - ETA: 1:19 - loss: 0.3637 - regression_loss: 0.3364 - classification_loss: 0.0273 355/500 [====================>.........] - ETA: 1:18 - loss: 0.3631 - regression_loss: 0.3359 - classification_loss: 0.0272 356/500 [====================>.........] - ETA: 1:17 - loss: 0.3630 - regression_loss: 0.3358 - classification_loss: 0.0272 357/500 [====================>.........] - ETA: 1:17 - loss: 0.3628 - regression_loss: 0.3356 - classification_loss: 0.0272 358/500 [====================>.........] - ETA: 1:16 - loss: 0.3632 - regression_loss: 0.3360 - classification_loss: 0.0272 359/500 [====================>.........] - ETA: 1:16 - loss: 0.3638 - regression_loss: 0.3366 - classification_loss: 0.0272 360/500 [====================>.........] - ETA: 1:15 - loss: 0.3646 - regression_loss: 0.3374 - classification_loss: 0.0273 361/500 [====================>.........] - ETA: 1:15 - loss: 0.3643 - regression_loss: 0.3370 - classification_loss: 0.0273 362/500 [====================>.........] - ETA: 1:14 - loss: 0.3648 - regression_loss: 0.3375 - classification_loss: 0.0274 363/500 [====================>.........] - ETA: 1:14 - loss: 0.3645 - regression_loss: 0.3371 - classification_loss: 0.0273 364/500 [====================>.........] 
- ETA: 1:13 - loss: 0.3646 - regression_loss: 0.3373 - classification_loss: 0.0273 365/500 [====================>.........] - ETA: 1:12 - loss: 0.3643 - regression_loss: 0.3370 - classification_loss: 0.0273 366/500 [====================>.........] - ETA: 1:12 - loss: 0.3641 - regression_loss: 0.3368 - classification_loss: 0.0273 367/500 [=====================>........] - ETA: 1:11 - loss: 0.3640 - regression_loss: 0.3368 - classification_loss: 0.0273 368/500 [=====================>........] - ETA: 1:11 - loss: 0.3639 - regression_loss: 0.3367 - classification_loss: 0.0272 369/500 [=====================>........] - ETA: 1:10 - loss: 0.3635 - regression_loss: 0.3363 - classification_loss: 0.0272 370/500 [=====================>........] - ETA: 1:10 - loss: 0.3632 - regression_loss: 0.3361 - classification_loss: 0.0272 371/500 [=====================>........] - ETA: 1:09 - loss: 0.3629 - regression_loss: 0.3358 - classification_loss: 0.0271 372/500 [=====================>........] - ETA: 1:09 - loss: 0.3629 - regression_loss: 0.3358 - classification_loss: 0.0271 373/500 [=====================>........] - ETA: 1:08 - loss: 0.3623 - regression_loss: 0.3353 - classification_loss: 0.0271 374/500 [=====================>........] - ETA: 1:08 - loss: 0.3621 - regression_loss: 0.3351 - classification_loss: 0.0271 375/500 [=====================>........] - ETA: 1:07 - loss: 0.3618 - regression_loss: 0.3348 - classification_loss: 0.0270 376/500 [=====================>........] - ETA: 1:06 - loss: 0.3625 - regression_loss: 0.3354 - classification_loss: 0.0272 377/500 [=====================>........] - ETA: 1:06 - loss: 0.3628 - regression_loss: 0.3356 - classification_loss: 0.0272 378/500 [=====================>........] - ETA: 1:05 - loss: 0.3630 - regression_loss: 0.3358 - classification_loss: 0.0272 379/500 [=====================>........] - ETA: 1:05 - loss: 0.3632 - regression_loss: 0.3360 - classification_loss: 0.0272 380/500 [=====================>........] 
- ETA: 1:04 - loss: 0.3637 - regression_loss: 0.3366 - classification_loss: 0.0271 381/500 [=====================>........] - ETA: 1:04 - loss: 0.3634 - regression_loss: 0.3363 - classification_loss: 0.0272 382/500 [=====================>........] - ETA: 1:03 - loss: 0.3633 - regression_loss: 0.3362 - classification_loss: 0.0271 383/500 [=====================>........] - ETA: 1:03 - loss: 0.3651 - regression_loss: 0.3374 - classification_loss: 0.0277 384/500 [======================>.......] - ETA: 1:02 - loss: 0.3655 - regression_loss: 0.3377 - classification_loss: 0.0277 385/500 [======================>.......] - ETA: 1:01 - loss: 0.3654 - regression_loss: 0.3377 - classification_loss: 0.0277 386/500 [======================>.......] - ETA: 1:01 - loss: 0.3654 - regression_loss: 0.3378 - classification_loss: 0.0277 387/500 [======================>.......] - ETA: 1:00 - loss: 0.3650 - regression_loss: 0.3374 - classification_loss: 0.0276 388/500 [======================>.......] - ETA: 1:00 - loss: 0.3647 - regression_loss: 0.3371 - classification_loss: 0.0276 389/500 [======================>.......] - ETA: 59s - loss: 0.3644 - regression_loss: 0.3368 - classification_loss: 0.0276  390/500 [======================>.......] - ETA: 59s - loss: 0.3648 - regression_loss: 0.3372 - classification_loss: 0.0276 391/500 [======================>.......] - ETA: 58s - loss: 0.3643 - regression_loss: 0.3368 - classification_loss: 0.0276 392/500 [======================>.......] - ETA: 58s - loss: 0.3644 - regression_loss: 0.3368 - classification_loss: 0.0276 393/500 [======================>.......] - ETA: 57s - loss: 0.3638 - regression_loss: 0.3363 - classification_loss: 0.0275 394/500 [======================>.......] - ETA: 57s - loss: 0.3637 - regression_loss: 0.3362 - classification_loss: 0.0275 395/500 [======================>.......] - ETA: 56s - loss: 0.3633 - regression_loss: 0.3358 - classification_loss: 0.0275 396/500 [======================>.......] 
- ETA: 55s - loss: 0.3632 - regression_loss: 0.3358 - classification_loss: 0.0274 397/500 [======================>.......] - ETA: 55s - loss: 0.3637 - regression_loss: 0.3362 - classification_loss: 0.0275 398/500 [======================>.......] - ETA: 54s - loss: 0.3637 - regression_loss: 0.3363 - classification_loss: 0.0274 399/500 [======================>.......] - ETA: 54s - loss: 0.3640 - regression_loss: 0.3366 - classification_loss: 0.0274 400/500 [=======================>......] - ETA: 53s - loss: 0.3637 - regression_loss: 0.3363 - classification_loss: 0.0274 401/500 [=======================>......] - ETA: 53s - loss: 0.3637 - regression_loss: 0.3363 - classification_loss: 0.0273 402/500 [=======================>......] - ETA: 52s - loss: 0.3639 - regression_loss: 0.3365 - classification_loss: 0.0274 403/500 [=======================>......] - ETA: 52s - loss: 0.3641 - regression_loss: 0.3367 - classification_loss: 0.0274 404/500 [=======================>......] - ETA: 51s - loss: 0.3642 - regression_loss: 0.3368 - classification_loss: 0.0274 405/500 [=======================>......] - ETA: 51s - loss: 0.3643 - regression_loss: 0.3369 - classification_loss: 0.0274 406/500 [=======================>......] - ETA: 50s - loss: 0.3646 - regression_loss: 0.3371 - classification_loss: 0.0274 407/500 [=======================>......] - ETA: 49s - loss: 0.3642 - regression_loss: 0.3368 - classification_loss: 0.0274 408/500 [=======================>......] - ETA: 49s - loss: 0.3652 - regression_loss: 0.3378 - classification_loss: 0.0275 409/500 [=======================>......] - ETA: 48s - loss: 0.3649 - regression_loss: 0.3374 - classification_loss: 0.0274 410/500 [=======================>......] - ETA: 48s - loss: 0.3644 - regression_loss: 0.3370 - classification_loss: 0.0274 411/500 [=======================>......] - ETA: 47s - loss: 0.3642 - regression_loss: 0.3368 - classification_loss: 0.0274 412/500 [=======================>......] 
- ETA: 47s - loss: 0.3642 - regression_loss: 0.3368 - classification_loss: 0.0274 413/500 [=======================>......] - ETA: 46s - loss: 0.3642 - regression_loss: 0.3368 - classification_loss: 0.0274 414/500 [=======================>......] - ETA: 46s - loss: 0.3639 - regression_loss: 0.3366 - classification_loss: 0.0274 415/500 [=======================>......] - ETA: 45s - loss: 0.3636 - regression_loss: 0.3363 - classification_loss: 0.0273 416/500 [=======================>......] - ETA: 45s - loss: 0.3636 - regression_loss: 0.3363 - classification_loss: 0.0273 417/500 [========================>.....] - ETA: 44s - loss: 0.3632 - regression_loss: 0.3359 - classification_loss: 0.0273 418/500 [========================>.....] - ETA: 43s - loss: 0.3632 - regression_loss: 0.3359 - classification_loss: 0.0273 419/500 [========================>.....] - ETA: 43s - loss: 0.3624 - regression_loss: 0.3352 - classification_loss: 0.0272 420/500 [========================>.....] - ETA: 42s - loss: 0.3628 - regression_loss: 0.3354 - classification_loss: 0.0274 421/500 [========================>.....] - ETA: 42s - loss: 0.3624 - regression_loss: 0.3351 - classification_loss: 0.0273 422/500 [========================>.....] - ETA: 41s - loss: 0.3623 - regression_loss: 0.3350 - classification_loss: 0.0273 423/500 [========================>.....] - ETA: 41s - loss: 0.3620 - regression_loss: 0.3347 - classification_loss: 0.0273 424/500 [========================>.....] - ETA: 40s - loss: 0.3626 - regression_loss: 0.3352 - classification_loss: 0.0274 425/500 [========================>.....] - ETA: 40s - loss: 0.3627 - regression_loss: 0.3353 - classification_loss: 0.0274 426/500 [========================>.....] - ETA: 39s - loss: 0.3629 - regression_loss: 0.3355 - classification_loss: 0.0274 427/500 [========================>.....] - ETA: 39s - loss: 0.3629 - regression_loss: 0.3356 - classification_loss: 0.0274 428/500 [========================>.....] 
- ETA: 38s - loss: 0.3627 - regression_loss: 0.3353 - classification_loss: 0.0274 429/500 [========================>.....] - ETA: 38s - loss: 0.3625 - regression_loss: 0.3351 - classification_loss: 0.0274 430/500 [========================>.....] - ETA: 37s - loss: 0.3621 - regression_loss: 0.3348 - classification_loss: 0.0273 431/500 [========================>.....] - ETA: 36s - loss: 0.3620 - regression_loss: 0.3347 - classification_loss: 0.0273 432/500 [========================>.....] - ETA: 36s - loss: 0.3618 - regression_loss: 0.3345 - classification_loss: 0.0273 433/500 [========================>.....] - ETA: 35s - loss: 0.3623 - regression_loss: 0.3350 - classification_loss: 0.0273 434/500 [=========================>....] - ETA: 35s - loss: 0.3623 - regression_loss: 0.3350 - classification_loss: 0.0273 435/500 [=========================>....] - ETA: 34s - loss: 0.3627 - regression_loss: 0.3354 - classification_loss: 0.0273 436/500 [=========================>....] - ETA: 34s - loss: 0.3625 - regression_loss: 0.3352 - classification_loss: 0.0273 437/500 [=========================>....] - ETA: 33s - loss: 0.3622 - regression_loss: 0.3349 - classification_loss: 0.0273 438/500 [=========================>....] - ETA: 33s - loss: 0.3635 - regression_loss: 0.3361 - classification_loss: 0.0274 439/500 [=========================>....] - ETA: 32s - loss: 0.3631 - regression_loss: 0.3357 - classification_loss: 0.0274 440/500 [=========================>....] - ETA: 32s - loss: 0.3629 - regression_loss: 0.3355 - classification_loss: 0.0274 441/500 [=========================>....] - ETA: 31s - loss: 0.3625 - regression_loss: 0.3352 - classification_loss: 0.0273 442/500 [=========================>....] - ETA: 31s - loss: 0.3628 - regression_loss: 0.3355 - classification_loss: 0.0273 443/500 [=========================>....] - ETA: 30s - loss: 0.3627 - regression_loss: 0.3354 - classification_loss: 0.0273 444/500 [=========================>....] 
- ETA: 29s - loss: 0.3625 - regression_loss: 0.3352 - classification_loss: 0.0273 445/500 [=========================>....] - ETA: 29s - loss: 0.3628 - regression_loss: 0.3354 - classification_loss: 0.0273 446/500 [=========================>....] - ETA: 28s - loss: 0.3624 - regression_loss: 0.3351 - classification_loss: 0.0273 447/500 [=========================>....] - ETA: 28s - loss: 0.3622 - regression_loss: 0.3349 - classification_loss: 0.0273 448/500 [=========================>....] - ETA: 27s - loss: 0.3618 - regression_loss: 0.3346 - classification_loss: 0.0272 449/500 [=========================>....] - ETA: 27s - loss: 0.3614 - regression_loss: 0.3342 - classification_loss: 0.0272 450/500 [==========================>...] - ETA: 26s - loss: 0.3611 - regression_loss: 0.3339 - classification_loss: 0.0272 451/500 [==========================>...] - ETA: 26s - loss: 0.3614 - regression_loss: 0.3342 - classification_loss: 0.0272 452/500 [==========================>...] - ETA: 25s - loss: 0.3613 - regression_loss: 0.3341 - classification_loss: 0.0272 453/500 [==========================>...] - ETA: 25s - loss: 0.3609 - regression_loss: 0.3337 - classification_loss: 0.0272 454/500 [==========================>...] - ETA: 24s - loss: 0.3606 - regression_loss: 0.3335 - classification_loss: 0.0272 455/500 [==========================>...] - ETA: 24s - loss: 0.3605 - regression_loss: 0.3334 - classification_loss: 0.0271 456/500 [==========================>...] - ETA: 23s - loss: 0.3604 - regression_loss: 0.3333 - classification_loss: 0.0271 457/500 [==========================>...] - ETA: 22s - loss: 0.3600 - regression_loss: 0.3329 - classification_loss: 0.0271 458/500 [==========================>...] - ETA: 22s - loss: 0.3597 - regression_loss: 0.3326 - classification_loss: 0.0271 459/500 [==========================>...] - ETA: 21s - loss: 0.3594 - regression_loss: 0.3324 - classification_loss: 0.0270 460/500 [==========================>...] 
- ETA: 21s - loss: 0.3593 - regression_loss: 0.3323 - classification_loss: 0.0270 461/500 [==========================>...] - ETA: 20s - loss: 0.3590 - regression_loss: 0.3321 - classification_loss: 0.0270 462/500 [==========================>...] - ETA: 20s - loss: 0.3601 - regression_loss: 0.3330 - classification_loss: 0.0270 463/500 [==========================>...] - ETA: 19s - loss: 0.3601 - regression_loss: 0.3331 - classification_loss: 0.0270 464/500 [==========================>...] - ETA: 19s - loss: 0.3609 - regression_loss: 0.3337 - classification_loss: 0.0271 465/500 [==========================>...] - ETA: 18s - loss: 0.3612 - regression_loss: 0.3340 - classification_loss: 0.0272 466/500 [==========================>...] - ETA: 18s - loss: 0.3614 - regression_loss: 0.3342 - classification_loss: 0.0272 467/500 [===========================>..] - ETA: 17s - loss: 0.3614 - regression_loss: 0.3342 - classification_loss: 0.0272 468/500 [===========================>..] - ETA: 17s - loss: 0.3614 - regression_loss: 0.3342 - classification_loss: 0.0272 469/500 [===========================>..] - ETA: 16s - loss: 0.3613 - regression_loss: 0.3341 - classification_loss: 0.0272 470/500 [===========================>..] - ETA: 16s - loss: 0.3611 - regression_loss: 0.3340 - classification_loss: 0.0272 471/500 [===========================>..] - ETA: 15s - loss: 0.3613 - regression_loss: 0.3341 - classification_loss: 0.0272 472/500 [===========================>..] - ETA: 14s - loss: 0.3619 - regression_loss: 0.3346 - classification_loss: 0.0273 473/500 [===========================>..] - ETA: 14s - loss: 0.3619 - regression_loss: 0.3346 - classification_loss: 0.0273 474/500 [===========================>..] - ETA: 13s - loss: 0.3620 - regression_loss: 0.3347 - classification_loss: 0.0273 475/500 [===========================>..] - ETA: 13s - loss: 0.3621 - regression_loss: 0.3348 - classification_loss: 0.0273 476/500 [===========================>..] 
- ETA: 12s - loss: 0.3622 - regression_loss: 0.3349 - classification_loss: 0.0273 477/500 [===========================>..] - ETA: 12s - loss: 0.3622 - regression_loss: 0.3350 - classification_loss: 0.0273 478/500 [===========================>..] - ETA: 11s - loss: 0.3620 - regression_loss: 0.3347 - classification_loss: 0.0272 479/500 [===========================>..] - ETA: 11s - loss: 0.3619 - regression_loss: 0.3347 - classification_loss: 0.0272 480/500 [===========================>..] - ETA: 10s - loss: 0.3620 - regression_loss: 0.3347 - classification_loss: 0.0272 481/500 [===========================>..] - ETA: 10s - loss: 0.3619 - regression_loss: 0.3346 - classification_loss: 0.0272 482/500 [===========================>..] - ETA: 9s - loss: 0.3619 - regression_loss: 0.3347 - classification_loss: 0.0272  483/500 [===========================>..] - ETA: 9s - loss: 0.3618 - regression_loss: 0.3346 - classification_loss: 0.0272 484/500 [============================>.] - ETA: 8s - loss: 0.3613 - regression_loss: 0.3342 - classification_loss: 0.0272 485/500 [============================>.] - ETA: 7s - loss: 0.3614 - regression_loss: 0.3343 - classification_loss: 0.0272 486/500 [============================>.] - ETA: 7s - loss: 0.3615 - regression_loss: 0.3343 - classification_loss: 0.0272 487/500 [============================>.] - ETA: 6s - loss: 0.3615 - regression_loss: 0.3343 - classification_loss: 0.0272 488/500 [============================>.] - ETA: 6s - loss: 0.3612 - regression_loss: 0.3340 - classification_loss: 0.0272 489/500 [============================>.] - ETA: 5s - loss: 0.3613 - regression_loss: 0.3342 - classification_loss: 0.0272 490/500 [============================>.] - ETA: 5s - loss: 0.3617 - regression_loss: 0.3345 - classification_loss: 0.0272 491/500 [============================>.] - ETA: 4s - loss: 0.3621 - regression_loss: 0.3349 - classification_loss: 0.0272 492/500 [============================>.] 
[epoch 1 steps 492-499 condensed: loss settled around 0.363]
500/500 [==============================] - 266s 532ms/step - loss: 0.3626 - regression_loss: 0.3353 - classification_loss: 0.0273
326 instances of class plum with average precision: 0.8553
mAP: 0.8553
Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5
Epoch 2/150
[epoch 2 steps 1-6 condensed: loss 0.2211 at step 1, rising to ~0.335 by step 6]
[epoch 2 progress output for steps 7-54 condensed: redrawn progress-bar lines removed; loss fluctuated between ~0.35 and ~0.41, ending near 0.405 at step 54; classification_loss ranged ~0.022-0.031]
- ETA: 3:46 - loss: 0.4030 - regression_loss: 0.3738 - classification_loss: 0.0293 56/500 [==>...........................] - ETA: 3:45 - loss: 0.4019 - regression_loss: 0.3728 - classification_loss: 0.0291 57/500 [==>...........................] - ETA: 3:45 - loss: 0.3989 - regression_loss: 0.3699 - classification_loss: 0.0290 58/500 [==>...........................] - ETA: 3:44 - loss: 0.3978 - regression_loss: 0.3687 - classification_loss: 0.0291 59/500 [==>...........................] - ETA: 3:44 - loss: 0.3960 - regression_loss: 0.3671 - classification_loss: 0.0289 60/500 [==>...........................] - ETA: 3:43 - loss: 0.3967 - regression_loss: 0.3680 - classification_loss: 0.0287 61/500 [==>...........................] - ETA: 3:43 - loss: 0.3967 - regression_loss: 0.3682 - classification_loss: 0.0285 62/500 [==>...........................] - ETA: 3:42 - loss: 0.3989 - regression_loss: 0.3704 - classification_loss: 0.0285 63/500 [==>...........................] - ETA: 3:41 - loss: 0.4011 - regression_loss: 0.3728 - classification_loss: 0.0283 64/500 [==>...........................] - ETA: 3:41 - loss: 0.4019 - regression_loss: 0.3737 - classification_loss: 0.0282 65/500 [==>...........................] - ETA: 3:40 - loss: 0.4009 - regression_loss: 0.3728 - classification_loss: 0.0281 66/500 [==>...........................] - ETA: 3:40 - loss: 0.4022 - regression_loss: 0.3741 - classification_loss: 0.0281 67/500 [===>..........................] - ETA: 3:39 - loss: 0.3996 - regression_loss: 0.3717 - classification_loss: 0.0278 68/500 [===>..........................] - ETA: 3:39 - loss: 0.3990 - regression_loss: 0.3713 - classification_loss: 0.0277 69/500 [===>..........................] - ETA: 3:38 - loss: 0.3979 - regression_loss: 0.3702 - classification_loss: 0.0277 70/500 [===>..........................] - ETA: 3:38 - loss: 0.3955 - regression_loss: 0.3680 - classification_loss: 0.0275 71/500 [===>..........................] 
- ETA: 3:37 - loss: 0.3987 - regression_loss: 0.3712 - classification_loss: 0.0274 72/500 [===>..........................] - ETA: 3:37 - loss: 0.3984 - regression_loss: 0.3710 - classification_loss: 0.0274 73/500 [===>..........................] - ETA: 3:36 - loss: 0.3955 - regression_loss: 0.3684 - classification_loss: 0.0271 74/500 [===>..........................] - ETA: 3:36 - loss: 0.3960 - regression_loss: 0.3690 - classification_loss: 0.0270 75/500 [===>..........................] - ETA: 3:35 - loss: 0.3964 - regression_loss: 0.3695 - classification_loss: 0.0269 76/500 [===>..........................] - ETA: 3:35 - loss: 0.3942 - regression_loss: 0.3676 - classification_loss: 0.0266 77/500 [===>..........................] - ETA: 3:34 - loss: 0.3945 - regression_loss: 0.3681 - classification_loss: 0.0265 78/500 [===>..........................] - ETA: 3:34 - loss: 0.3966 - regression_loss: 0.3699 - classification_loss: 0.0267 79/500 [===>..........................] - ETA: 3:33 - loss: 0.3945 - regression_loss: 0.3680 - classification_loss: 0.0265 80/500 [===>..........................] - ETA: 3:33 - loss: 0.3936 - regression_loss: 0.3672 - classification_loss: 0.0264 81/500 [===>..........................] - ETA: 3:32 - loss: 0.3923 - regression_loss: 0.3660 - classification_loss: 0.0263 82/500 [===>..........................] - ETA: 3:32 - loss: 0.3921 - regression_loss: 0.3656 - classification_loss: 0.0265 83/500 [===>..........................] - ETA: 3:31 - loss: 0.3900 - regression_loss: 0.3636 - classification_loss: 0.0264 84/500 [====>.........................] - ETA: 3:31 - loss: 0.3912 - regression_loss: 0.3647 - classification_loss: 0.0266 85/500 [====>.........................] - ETA: 3:30 - loss: 0.3897 - regression_loss: 0.3633 - classification_loss: 0.0265 86/500 [====>.........................] - ETA: 3:30 - loss: 0.3893 - regression_loss: 0.3629 - classification_loss: 0.0264 87/500 [====>.........................] 
- ETA: 3:29 - loss: 0.3879 - regression_loss: 0.3617 - classification_loss: 0.0263 88/500 [====>.........................] - ETA: 3:29 - loss: 0.3903 - regression_loss: 0.3641 - classification_loss: 0.0261 89/500 [====>.........................] - ETA: 3:28 - loss: 0.3880 - regression_loss: 0.3621 - classification_loss: 0.0259 90/500 [====>.........................] - ETA: 3:28 - loss: 0.3875 - regression_loss: 0.3616 - classification_loss: 0.0260 91/500 [====>.........................] - ETA: 3:27 - loss: 0.3858 - regression_loss: 0.3600 - classification_loss: 0.0258 92/500 [====>.........................] - ETA: 3:27 - loss: 0.3849 - regression_loss: 0.3592 - classification_loss: 0.0257 93/500 [====>.........................] - ETA: 3:26 - loss: 0.3826 - regression_loss: 0.3571 - classification_loss: 0.0256 94/500 [====>.........................] - ETA: 3:26 - loss: 0.3834 - regression_loss: 0.3578 - classification_loss: 0.0256 95/500 [====>.........................] - ETA: 3:25 - loss: 0.3825 - regression_loss: 0.3570 - classification_loss: 0.0255 96/500 [====>.........................] - ETA: 3:25 - loss: 0.3847 - regression_loss: 0.3590 - classification_loss: 0.0257 97/500 [====>.........................] - ETA: 3:24 - loss: 0.3837 - regression_loss: 0.3582 - classification_loss: 0.0256 98/500 [====>.........................] - ETA: 3:24 - loss: 0.3830 - regression_loss: 0.3575 - classification_loss: 0.0256 99/500 [====>.........................] - ETA: 3:23 - loss: 0.3826 - regression_loss: 0.3571 - classification_loss: 0.0255 100/500 [=====>........................] - ETA: 3:23 - loss: 0.3813 - regression_loss: 0.3559 - classification_loss: 0.0254 101/500 [=====>........................] - ETA: 3:22 - loss: 0.3809 - regression_loss: 0.3555 - classification_loss: 0.0254 102/500 [=====>........................] - ETA: 3:22 - loss: 0.3798 - regression_loss: 0.3545 - classification_loss: 0.0253 103/500 [=====>........................] 
- ETA: 3:21 - loss: 0.3776 - regression_loss: 0.3525 - classification_loss: 0.0251 104/500 [=====>........................] - ETA: 3:21 - loss: 0.3759 - regression_loss: 0.3509 - classification_loss: 0.0250 105/500 [=====>........................] - ETA: 3:20 - loss: 0.3743 - regression_loss: 0.3495 - classification_loss: 0.0249 106/500 [=====>........................] - ETA: 3:20 - loss: 0.3726 - regression_loss: 0.3479 - classification_loss: 0.0247 107/500 [=====>........................] - ETA: 3:19 - loss: 0.3735 - regression_loss: 0.3488 - classification_loss: 0.0248 108/500 [=====>........................] - ETA: 3:18 - loss: 0.3738 - regression_loss: 0.3490 - classification_loss: 0.0247 109/500 [=====>........................] - ETA: 3:18 - loss: 0.3756 - regression_loss: 0.3506 - classification_loss: 0.0249 110/500 [=====>........................] - ETA: 3:18 - loss: 0.3737 - regression_loss: 0.3488 - classification_loss: 0.0249 111/500 [=====>........................] - ETA: 3:17 - loss: 0.3721 - regression_loss: 0.3472 - classification_loss: 0.0248 112/500 [=====>........................] - ETA: 3:17 - loss: 0.3707 - regression_loss: 0.3460 - classification_loss: 0.0247 113/500 [=====>........................] - ETA: 3:16 - loss: 0.3702 - regression_loss: 0.3455 - classification_loss: 0.0247 114/500 [=====>........................] - ETA: 3:16 - loss: 0.3728 - regression_loss: 0.3478 - classification_loss: 0.0250 115/500 [=====>........................] - ETA: 3:15 - loss: 0.3723 - regression_loss: 0.3473 - classification_loss: 0.0250 116/500 [=====>........................] - ETA: 3:15 - loss: 0.3714 - regression_loss: 0.3464 - classification_loss: 0.0250 117/500 [======>.......................] - ETA: 3:14 - loss: 0.3720 - regression_loss: 0.3470 - classification_loss: 0.0250 118/500 [======>.......................] - ETA: 3:14 - loss: 0.3731 - regression_loss: 0.3480 - classification_loss: 0.0251 119/500 [======>.......................] 
- ETA: 3:13 - loss: 0.3713 - regression_loss: 0.3463 - classification_loss: 0.0250 120/500 [======>.......................] - ETA: 3:13 - loss: 0.3708 - regression_loss: 0.3459 - classification_loss: 0.0249 121/500 [======>.......................] - ETA: 3:12 - loss: 0.3697 - regression_loss: 0.3449 - classification_loss: 0.0248 122/500 [======>.......................] - ETA: 3:12 - loss: 0.3682 - regression_loss: 0.3434 - classification_loss: 0.0247 123/500 [======>.......................] - ETA: 3:11 - loss: 0.3711 - regression_loss: 0.3461 - classification_loss: 0.0250 124/500 [======>.......................] - ETA: 3:10 - loss: 0.3714 - regression_loss: 0.3464 - classification_loss: 0.0250 125/500 [======>.......................] - ETA: 3:10 - loss: 0.3713 - regression_loss: 0.3462 - classification_loss: 0.0251 126/500 [======>.......................] - ETA: 3:09 - loss: 0.3755 - regression_loss: 0.3499 - classification_loss: 0.0255 127/500 [======>.......................] - ETA: 3:09 - loss: 0.3745 - regression_loss: 0.3491 - classification_loss: 0.0254 128/500 [======>.......................] - ETA: 3:08 - loss: 0.3728 - regression_loss: 0.3476 - classification_loss: 0.0253 129/500 [======>.......................] - ETA: 3:08 - loss: 0.3735 - regression_loss: 0.3483 - classification_loss: 0.0253 130/500 [======>.......................] - ETA: 3:07 - loss: 0.3749 - regression_loss: 0.3494 - classification_loss: 0.0255 131/500 [======>.......................] - ETA: 3:07 - loss: 0.3752 - regression_loss: 0.3496 - classification_loss: 0.0256 132/500 [======>.......................] - ETA: 3:06 - loss: 0.3772 - regression_loss: 0.3513 - classification_loss: 0.0260 133/500 [======>.......................] - ETA: 3:06 - loss: 0.3770 - regression_loss: 0.3510 - classification_loss: 0.0259 134/500 [=======>......................] - ETA: 3:05 - loss: 0.3761 - regression_loss: 0.3502 - classification_loss: 0.0258 135/500 [=======>......................] 
- ETA: 3:05 - loss: 0.3756 - regression_loss: 0.3498 - classification_loss: 0.0257 136/500 [=======>......................] - ETA: 3:04 - loss: 0.3753 - regression_loss: 0.3496 - classification_loss: 0.0258 137/500 [=======>......................] - ETA: 3:04 - loss: 0.3742 - regression_loss: 0.3486 - classification_loss: 0.0256 138/500 [=======>......................] - ETA: 3:03 - loss: 0.3735 - regression_loss: 0.3479 - classification_loss: 0.0256 139/500 [=======>......................] - ETA: 3:03 - loss: 0.3752 - regression_loss: 0.3496 - classification_loss: 0.0256 140/500 [=======>......................] - ETA: 3:02 - loss: 0.3755 - regression_loss: 0.3501 - classification_loss: 0.0255 141/500 [=======>......................] - ETA: 3:02 - loss: 0.3776 - regression_loss: 0.3519 - classification_loss: 0.0256 142/500 [=======>......................] - ETA: 3:01 - loss: 0.3779 - regression_loss: 0.3522 - classification_loss: 0.0257 143/500 [=======>......................] - ETA: 3:01 - loss: 0.3773 - regression_loss: 0.3516 - classification_loss: 0.0257 144/500 [=======>......................] - ETA: 3:00 - loss: 0.3775 - regression_loss: 0.3519 - classification_loss: 0.0256 145/500 [=======>......................] - ETA: 3:00 - loss: 0.3791 - regression_loss: 0.3534 - classification_loss: 0.0257 146/500 [=======>......................] - ETA: 3:00 - loss: 0.3792 - regression_loss: 0.3534 - classification_loss: 0.0257 147/500 [=======>......................] - ETA: 2:59 - loss: 0.3784 - regression_loss: 0.3528 - classification_loss: 0.0256 148/500 [=======>......................] - ETA: 2:59 - loss: 0.3777 - regression_loss: 0.3521 - classification_loss: 0.0256 149/500 [=======>......................] - ETA: 2:58 - loss: 0.3779 - regression_loss: 0.3524 - classification_loss: 0.0255 150/500 [========>.....................] - ETA: 2:58 - loss: 0.3778 - regression_loss: 0.3522 - classification_loss: 0.0255 151/500 [========>.....................] 
- ETA: 2:57 - loss: 0.3797 - regression_loss: 0.3540 - classification_loss: 0.0257 152/500 [========>.....................] - ETA: 2:56 - loss: 0.3799 - regression_loss: 0.3542 - classification_loss: 0.0257 153/500 [========>.....................] - ETA: 2:56 - loss: 0.3798 - regression_loss: 0.3541 - classification_loss: 0.0257 154/500 [========>.....................] - ETA: 2:55 - loss: 0.3790 - regression_loss: 0.3534 - classification_loss: 0.0256 155/500 [========>.....................] - ETA: 2:55 - loss: 0.3793 - regression_loss: 0.3536 - classification_loss: 0.0257 156/500 [========>.....................] - ETA: 2:54 - loss: 0.3793 - regression_loss: 0.3537 - classification_loss: 0.0256 157/500 [========>.....................] - ETA: 2:54 - loss: 0.3804 - regression_loss: 0.3548 - classification_loss: 0.0256 158/500 [========>.....................] - ETA: 2:53 - loss: 0.3806 - regression_loss: 0.3550 - classification_loss: 0.0256 159/500 [========>.....................] - ETA: 2:53 - loss: 0.3801 - regression_loss: 0.3545 - classification_loss: 0.0255 160/500 [========>.....................] - ETA: 2:53 - loss: 0.3792 - regression_loss: 0.3536 - classification_loss: 0.0255 161/500 [========>.....................] - ETA: 2:52 - loss: 0.3782 - regression_loss: 0.3528 - classification_loss: 0.0255 162/500 [========>.....................] - ETA: 2:52 - loss: 0.3770 - regression_loss: 0.3517 - classification_loss: 0.0254 163/500 [========>.....................] - ETA: 2:51 - loss: 0.3762 - regression_loss: 0.3509 - classification_loss: 0.0253 164/500 [========>.....................] - ETA: 2:51 - loss: 0.3760 - regression_loss: 0.3506 - classification_loss: 0.0255 165/500 [========>.....................] - ETA: 2:50 - loss: 0.3771 - regression_loss: 0.3516 - classification_loss: 0.0255 166/500 [========>.....................] - ETA: 2:50 - loss: 0.3770 - regression_loss: 0.3515 - classification_loss: 0.0255 167/500 [=========>....................] 
- ETA: 2:49 - loss: 0.3770 - regression_loss: 0.3515 - classification_loss: 0.0255 168/500 [=========>....................] - ETA: 2:49 - loss: 0.3766 - regression_loss: 0.3512 - classification_loss: 0.0254 169/500 [=========>....................] - ETA: 2:48 - loss: 0.3753 - regression_loss: 0.3499 - classification_loss: 0.0253 170/500 [=========>....................] - ETA: 2:48 - loss: 0.3759 - regression_loss: 0.3506 - classification_loss: 0.0253 171/500 [=========>....................] - ETA: 2:47 - loss: 0.3799 - regression_loss: 0.3533 - classification_loss: 0.0266 172/500 [=========>....................] - ETA: 2:47 - loss: 0.3799 - regression_loss: 0.3533 - classification_loss: 0.0266 173/500 [=========>....................] - ETA: 2:46 - loss: 0.3796 - regression_loss: 0.3530 - classification_loss: 0.0265 174/500 [=========>....................] - ETA: 2:46 - loss: 0.3787 - regression_loss: 0.3523 - classification_loss: 0.0265 175/500 [=========>....................] - ETA: 2:45 - loss: 0.3806 - regression_loss: 0.3539 - classification_loss: 0.0267 176/500 [=========>....................] - ETA: 2:45 - loss: 0.3809 - regression_loss: 0.3542 - classification_loss: 0.0268 177/500 [=========>....................] - ETA: 2:44 - loss: 0.3802 - regression_loss: 0.3534 - classification_loss: 0.0267 178/500 [=========>....................] - ETA: 2:44 - loss: 0.3800 - regression_loss: 0.3533 - classification_loss: 0.0267 179/500 [=========>....................] - ETA: 2:43 - loss: 0.3806 - regression_loss: 0.3536 - classification_loss: 0.0270 180/500 [=========>....................] - ETA: 2:43 - loss: 0.3801 - regression_loss: 0.3532 - classification_loss: 0.0269 181/500 [=========>....................] - ETA: 2:42 - loss: 0.3783 - regression_loss: 0.3516 - classification_loss: 0.0268 182/500 [=========>....................] - ETA: 2:42 - loss: 0.3775 - regression_loss: 0.3508 - classification_loss: 0.0267 183/500 [=========>....................] 
- ETA: 2:41 - loss: 0.3778 - regression_loss: 0.3511 - classification_loss: 0.0267 184/500 [==========>...................] - ETA: 2:41 - loss: 0.3779 - regression_loss: 0.3513 - classification_loss: 0.0266 185/500 [==========>...................] - ETA: 2:40 - loss: 0.3766 - regression_loss: 0.3501 - classification_loss: 0.0265 186/500 [==========>...................] - ETA: 2:40 - loss: 0.3765 - regression_loss: 0.3501 - classification_loss: 0.0265 187/500 [==========>...................] - ETA: 2:39 - loss: 0.3785 - regression_loss: 0.3514 - classification_loss: 0.0271 188/500 [==========>...................] - ETA: 2:39 - loss: 0.3786 - regression_loss: 0.3516 - classification_loss: 0.0271 189/500 [==========>...................] - ETA: 2:38 - loss: 0.3787 - regression_loss: 0.3516 - classification_loss: 0.0271 190/500 [==========>...................] - ETA: 2:38 - loss: 0.3826 - regression_loss: 0.3552 - classification_loss: 0.0274 191/500 [==========>...................] - ETA: 2:37 - loss: 0.3836 - regression_loss: 0.3562 - classification_loss: 0.0274 192/500 [==========>...................] - ETA: 2:37 - loss: 0.3828 - regression_loss: 0.3554 - classification_loss: 0.0274 193/500 [==========>...................] - ETA: 2:36 - loss: 0.3819 - regression_loss: 0.3546 - classification_loss: 0.0273 194/500 [==========>...................] - ETA: 2:36 - loss: 0.3815 - regression_loss: 0.3542 - classification_loss: 0.0274 195/500 [==========>...................] - ETA: 2:35 - loss: 0.3807 - regression_loss: 0.3533 - classification_loss: 0.0274 196/500 [==========>...................] - ETA: 2:35 - loss: 0.3804 - regression_loss: 0.3531 - classification_loss: 0.0273 197/500 [==========>...................] - ETA: 2:34 - loss: 0.3798 - regression_loss: 0.3526 - classification_loss: 0.0273 198/500 [==========>...................] - ETA: 2:34 - loss: 0.3789 - regression_loss: 0.3517 - classification_loss: 0.0272 199/500 [==========>...................] 
- ETA: 2:33 - loss: 0.3784 - regression_loss: 0.3511 - classification_loss: 0.0272 200/500 [===========>..................] - ETA: 2:33 - loss: 0.3780 - regression_loss: 0.3508 - classification_loss: 0.0272 201/500 [===========>..................] - ETA: 2:32 - loss: 0.3785 - regression_loss: 0.3514 - classification_loss: 0.0272 202/500 [===========>..................] - ETA: 2:32 - loss: 0.3781 - regression_loss: 0.3510 - classification_loss: 0.0272 203/500 [===========>..................] - ETA: 2:31 - loss: 0.3780 - regression_loss: 0.3509 - classification_loss: 0.0271 204/500 [===========>..................] - ETA: 2:30 - loss: 0.3773 - regression_loss: 0.3502 - classification_loss: 0.0271 205/500 [===========>..................] - ETA: 2:30 - loss: 0.3762 - regression_loss: 0.3492 - classification_loss: 0.0270 206/500 [===========>..................] - ETA: 2:29 - loss: 0.3755 - regression_loss: 0.3485 - classification_loss: 0.0269 207/500 [===========>..................] - ETA: 2:29 - loss: 0.3761 - regression_loss: 0.3492 - classification_loss: 0.0269 208/500 [===========>..................] - ETA: 2:28 - loss: 0.3768 - regression_loss: 0.3499 - classification_loss: 0.0269 209/500 [===========>..................] - ETA: 2:28 - loss: 0.3768 - regression_loss: 0.3499 - classification_loss: 0.0269 210/500 [===========>..................] - ETA: 2:27 - loss: 0.3761 - regression_loss: 0.3493 - classification_loss: 0.0268 211/500 [===========>..................] - ETA: 2:27 - loss: 0.3764 - regression_loss: 0.3496 - classification_loss: 0.0268 212/500 [===========>..................] - ETA: 2:26 - loss: 0.3761 - regression_loss: 0.3492 - classification_loss: 0.0268 213/500 [===========>..................] - ETA: 2:26 - loss: 0.3759 - regression_loss: 0.3491 - classification_loss: 0.0268 214/500 [===========>..................] - ETA: 2:25 - loss: 0.3782 - regression_loss: 0.3512 - classification_loss: 0.0269 215/500 [===========>..................] 
- ETA: 2:25 - loss: 0.3777 - regression_loss: 0.3508 - classification_loss: 0.0269 216/500 [===========>..................] - ETA: 2:24 - loss: 0.3776 - regression_loss: 0.3506 - classification_loss: 0.0269 217/500 [============>.................] - ETA: 2:24 - loss: 0.3771 - regression_loss: 0.3502 - classification_loss: 0.0269 218/500 [============>.................] - ETA: 2:23 - loss: 0.3769 - regression_loss: 0.3501 - classification_loss: 0.0268 219/500 [============>.................] - ETA: 2:23 - loss: 0.3775 - regression_loss: 0.3506 - classification_loss: 0.0269 220/500 [============>.................] - ETA: 2:22 - loss: 0.3773 - regression_loss: 0.3504 - classification_loss: 0.0269 221/500 [============>.................] - ETA: 2:22 - loss: 0.3770 - regression_loss: 0.3501 - classification_loss: 0.0269 222/500 [============>.................] - ETA: 2:21 - loss: 0.3782 - regression_loss: 0.3513 - classification_loss: 0.0269 223/500 [============>.................] - ETA: 2:21 - loss: 0.3780 - regression_loss: 0.3512 - classification_loss: 0.0268 224/500 [============>.................] - ETA: 2:20 - loss: 0.3777 - regression_loss: 0.3509 - classification_loss: 0.0268 225/500 [============>.................] - ETA: 2:20 - loss: 0.3781 - regression_loss: 0.3514 - classification_loss: 0.0268 226/500 [============>.................] - ETA: 2:19 - loss: 0.3780 - regression_loss: 0.3512 - classification_loss: 0.0268 227/500 [============>.................] - ETA: 2:19 - loss: 0.3784 - regression_loss: 0.3515 - classification_loss: 0.0268 228/500 [============>.................] - ETA: 2:18 - loss: 0.3788 - regression_loss: 0.3519 - classification_loss: 0.0269 229/500 [============>.................] - ETA: 2:18 - loss: 0.3779 - regression_loss: 0.3511 - classification_loss: 0.0268 230/500 [============>.................] - ETA: 2:17 - loss: 0.3774 - regression_loss: 0.3507 - classification_loss: 0.0268 231/500 [============>.................] 
- ETA: 2:17 - loss: 0.3792 - regression_loss: 0.3520 - classification_loss: 0.0272 232/500 [============>.................] - ETA: 2:16 - loss: 0.3786 - regression_loss: 0.3514 - classification_loss: 0.0272 233/500 [============>.................] - ETA: 2:16 - loss: 0.3779 - regression_loss: 0.3509 - classification_loss: 0.0271 234/500 [=============>................] - ETA: 2:15 - loss: 0.3769 - regression_loss: 0.3500 - classification_loss: 0.0270 235/500 [=============>................] - ETA: 2:15 - loss: 0.3777 - regression_loss: 0.3507 - classification_loss: 0.0270 236/500 [=============>................] - ETA: 2:14 - loss: 0.3775 - regression_loss: 0.3505 - classification_loss: 0.0270 237/500 [=============>................] - ETA: 2:14 - loss: 0.3787 - regression_loss: 0.3516 - classification_loss: 0.0271 238/500 [=============>................] - ETA: 2:13 - loss: 0.3786 - regression_loss: 0.3516 - classification_loss: 0.0270 239/500 [=============>................] - ETA: 2:13 - loss: 0.3815 - regression_loss: 0.3541 - classification_loss: 0.0274 240/500 [=============>................] - ETA: 2:12 - loss: 0.3817 - regression_loss: 0.3543 - classification_loss: 0.0273 241/500 [=============>................] - ETA: 2:11 - loss: 0.3820 - regression_loss: 0.3546 - classification_loss: 0.0274 242/500 [=============>................] - ETA: 2:11 - loss: 0.3819 - regression_loss: 0.3546 - classification_loss: 0.0273 243/500 [=============>................] - ETA: 2:10 - loss: 0.3817 - regression_loss: 0.3544 - classification_loss: 0.0273 244/500 [=============>................] - ETA: 2:10 - loss: 0.3812 - regression_loss: 0.3540 - classification_loss: 0.0272 245/500 [=============>................] - ETA: 2:09 - loss: 0.3811 - regression_loss: 0.3539 - classification_loss: 0.0272 246/500 [=============>................] - ETA: 2:09 - loss: 0.3810 - regression_loss: 0.3538 - classification_loss: 0.0271 247/500 [=============>................] 
- ETA: 2:08 - loss: 0.3828 - regression_loss: 0.3552 - classification_loss: 0.0276 248/500 [=============>................] - ETA: 2:08 - loss: 0.3849 - regression_loss: 0.3572 - classification_loss: 0.0277 249/500 [=============>................] - ETA: 2:07 - loss: 0.3854 - regression_loss: 0.3576 - classification_loss: 0.0278 250/500 [==============>...............] - ETA: 2:07 - loss: 0.3851 - regression_loss: 0.3574 - classification_loss: 0.0277 251/500 [==============>...............] - ETA: 2:06 - loss: 0.3855 - regression_loss: 0.3578 - classification_loss: 0.0277 252/500 [==============>...............] - ETA: 2:06 - loss: 0.3865 - regression_loss: 0.3588 - classification_loss: 0.0277 253/500 [==============>...............] - ETA: 2:05 - loss: 0.3874 - regression_loss: 0.3597 - classification_loss: 0.0277 254/500 [==============>...............] - ETA: 2:05 - loss: 0.3870 - regression_loss: 0.3594 - classification_loss: 0.0276 255/500 [==============>...............] - ETA: 2:04 - loss: 0.3871 - regression_loss: 0.3594 - classification_loss: 0.0277 256/500 [==============>...............] - ETA: 2:04 - loss: 0.3872 - regression_loss: 0.3595 - classification_loss: 0.0277 257/500 [==============>...............] - ETA: 2:03 - loss: 0.3872 - regression_loss: 0.3596 - classification_loss: 0.0277 258/500 [==============>...............] - ETA: 2:03 - loss: 0.3867 - regression_loss: 0.3591 - classification_loss: 0.0276 259/500 [==============>...............] - ETA: 2:02 - loss: 0.3871 - regression_loss: 0.3594 - classification_loss: 0.0276 260/500 [==============>...............] - ETA: 2:02 - loss: 0.3881 - regression_loss: 0.3605 - classification_loss: 0.0276 261/500 [==============>...............] - ETA: 2:01 - loss: 0.3898 - regression_loss: 0.3622 - classification_loss: 0.0276 262/500 [==============>...............] - ETA: 2:01 - loss: 0.3904 - regression_loss: 0.3628 - classification_loss: 0.0276 263/500 [==============>...............] 
[... epoch 2 per-step progress updates (steps 264-499) omitted; final step below ...]
500/500 [==============================] - 255s 511ms/step - loss: 0.3902 - regression_loss: 0.3625 - classification_loss: 0.0277
326 instances of class plum with average precision: 0.8430
mAP: 0.8430
Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5
Epoch 3/150
1/500 [..............................] - ETA: 4:17 - loss: 0.4148 - regression_loss: 0.3911 - classification_loss: 0.0236 2/500 [..............................]
[... epoch 3 per-step progress updates (steps 2-96) omitted; latest step below ...]
97/500 [====>.........................] - ETA: 3:26 - loss: 0.3847 - regression_loss: 0.3586 - classification_loss: 0.0261 98/500 [====>.........................]
- ETA: 3:25 - loss: 0.3841 - regression_loss: 0.3580 - classification_loss: 0.0261 99/500 [====>.........................] - ETA: 3:25 - loss: 0.3849 - regression_loss: 0.3588 - classification_loss: 0.0261 100/500 [=====>........................] - ETA: 3:25 - loss: 0.3870 - regression_loss: 0.3610 - classification_loss: 0.0261 101/500 [=====>........................] - ETA: 3:24 - loss: 0.3870 - regression_loss: 0.3610 - classification_loss: 0.0260 102/500 [=====>........................] - ETA: 3:24 - loss: 0.3884 - regression_loss: 0.3624 - classification_loss: 0.0260 103/500 [=====>........................] - ETA: 3:23 - loss: 0.3874 - regression_loss: 0.3615 - classification_loss: 0.0259 104/500 [=====>........................] - ETA: 3:22 - loss: 0.3873 - regression_loss: 0.3612 - classification_loss: 0.0261 105/500 [=====>........................] - ETA: 3:22 - loss: 0.3854 - regression_loss: 0.3594 - classification_loss: 0.0261 106/500 [=====>........................] - ETA: 3:21 - loss: 0.3878 - regression_loss: 0.3616 - classification_loss: 0.0261 107/500 [=====>........................] - ETA: 3:21 - loss: 0.3877 - regression_loss: 0.3616 - classification_loss: 0.0262 108/500 [=====>........................] - ETA: 3:20 - loss: 0.3855 - regression_loss: 0.3595 - classification_loss: 0.0260 109/500 [=====>........................] - ETA: 3:20 - loss: 0.3848 - regression_loss: 0.3589 - classification_loss: 0.0259 110/500 [=====>........................] - ETA: 3:19 - loss: 0.3849 - regression_loss: 0.3590 - classification_loss: 0.0259 111/500 [=====>........................] - ETA: 3:19 - loss: 0.3866 - regression_loss: 0.3605 - classification_loss: 0.0261 112/500 [=====>........................] - ETA: 3:18 - loss: 0.3849 - regression_loss: 0.3589 - classification_loss: 0.0260 113/500 [=====>........................] - ETA: 3:18 - loss: 0.3847 - regression_loss: 0.3587 - classification_loss: 0.0260 114/500 [=====>........................] 
- ETA: 3:17 - loss: 0.3837 - regression_loss: 0.3578 - classification_loss: 0.0258 115/500 [=====>........................] - ETA: 3:17 - loss: 0.3833 - regression_loss: 0.3576 - classification_loss: 0.0257 116/500 [=====>........................] - ETA: 3:16 - loss: 0.3825 - regression_loss: 0.3569 - classification_loss: 0.0257 117/500 [======>.......................] - ETA: 3:16 - loss: 0.3842 - regression_loss: 0.3584 - classification_loss: 0.0257 118/500 [======>.......................] - ETA: 3:15 - loss: 0.3857 - regression_loss: 0.3598 - classification_loss: 0.0258 119/500 [======>.......................] - ETA: 3:15 - loss: 0.3931 - regression_loss: 0.3669 - classification_loss: 0.0263 120/500 [======>.......................] - ETA: 3:14 - loss: 0.3956 - regression_loss: 0.3692 - classification_loss: 0.0264 121/500 [======>.......................] - ETA: 3:14 - loss: 0.3954 - regression_loss: 0.3690 - classification_loss: 0.0264 122/500 [======>.......................] - ETA: 3:13 - loss: 0.3965 - regression_loss: 0.3699 - classification_loss: 0.0266 123/500 [======>.......................] - ETA: 3:13 - loss: 0.3960 - regression_loss: 0.3696 - classification_loss: 0.0264 124/500 [======>.......................] - ETA: 3:12 - loss: 0.3953 - regression_loss: 0.3689 - classification_loss: 0.0263 125/500 [======>.......................] - ETA: 3:12 - loss: 0.3955 - regression_loss: 0.3692 - classification_loss: 0.0263 126/500 [======>.......................] - ETA: 3:11 - loss: 0.3948 - regression_loss: 0.3686 - classification_loss: 0.0262 127/500 [======>.......................] - ETA: 3:11 - loss: 0.3948 - regression_loss: 0.3688 - classification_loss: 0.0260 128/500 [======>.......................] - ETA: 3:10 - loss: 0.3949 - regression_loss: 0.3688 - classification_loss: 0.0261 129/500 [======>.......................] - ETA: 3:10 - loss: 0.3961 - regression_loss: 0.3700 - classification_loss: 0.0260 130/500 [======>.......................] 
- ETA: 3:09 - loss: 0.3962 - regression_loss: 0.3703 - classification_loss: 0.0259 131/500 [======>.......................] - ETA: 3:08 - loss: 0.3954 - regression_loss: 0.3696 - classification_loss: 0.0258 132/500 [======>.......................] - ETA: 3:08 - loss: 0.3955 - regression_loss: 0.3697 - classification_loss: 0.0258 133/500 [======>.......................] - ETA: 3:07 - loss: 0.3947 - regression_loss: 0.3690 - classification_loss: 0.0257 134/500 [=======>......................] - ETA: 3:07 - loss: 0.3940 - regression_loss: 0.3683 - classification_loss: 0.0257 135/500 [=======>......................] - ETA: 3:06 - loss: 0.3938 - regression_loss: 0.3681 - classification_loss: 0.0257 136/500 [=======>......................] - ETA: 3:06 - loss: 0.3930 - regression_loss: 0.3674 - classification_loss: 0.0256 137/500 [=======>......................] - ETA: 3:05 - loss: 0.3931 - regression_loss: 0.3673 - classification_loss: 0.0257 138/500 [=======>......................] - ETA: 3:05 - loss: 0.3925 - regression_loss: 0.3667 - classification_loss: 0.0258 139/500 [=======>......................] - ETA: 3:04 - loss: 0.3935 - regression_loss: 0.3676 - classification_loss: 0.0260 140/500 [=======>......................] - ETA: 3:04 - loss: 0.3921 - regression_loss: 0.3662 - classification_loss: 0.0259 141/500 [=======>......................] - ETA: 3:03 - loss: 0.3914 - regression_loss: 0.3655 - classification_loss: 0.0259 142/500 [=======>......................] - ETA: 3:03 - loss: 0.3915 - regression_loss: 0.3656 - classification_loss: 0.0258 143/500 [=======>......................] - ETA: 3:02 - loss: 0.3901 - regression_loss: 0.3643 - classification_loss: 0.0258 144/500 [=======>......................] - ETA: 3:02 - loss: 0.3896 - regression_loss: 0.3640 - classification_loss: 0.0257 145/500 [=======>......................] - ETA: 3:01 - loss: 0.3886 - regression_loss: 0.3631 - classification_loss: 0.0256 146/500 [=======>......................] 
- ETA: 3:01 - loss: 0.3881 - regression_loss: 0.3626 - classification_loss: 0.0255 147/500 [=======>......................] - ETA: 3:00 - loss: 0.3878 - regression_loss: 0.3623 - classification_loss: 0.0255 148/500 [=======>......................] - ETA: 3:00 - loss: 0.3885 - regression_loss: 0.3629 - classification_loss: 0.0256 149/500 [=======>......................] - ETA: 2:59 - loss: 0.3899 - regression_loss: 0.3642 - classification_loss: 0.0257 150/500 [========>.....................] - ETA: 2:59 - loss: 0.3898 - regression_loss: 0.3642 - classification_loss: 0.0257 151/500 [========>.....................] - ETA: 2:58 - loss: 0.3891 - regression_loss: 0.3635 - classification_loss: 0.0256 152/500 [========>.....................] - ETA: 2:58 - loss: 0.3879 - regression_loss: 0.3624 - classification_loss: 0.0255 153/500 [========>.....................] - ETA: 2:57 - loss: 0.3874 - regression_loss: 0.3619 - classification_loss: 0.0255 154/500 [========>.....................] - ETA: 2:57 - loss: 0.3866 - regression_loss: 0.3612 - classification_loss: 0.0254 155/500 [========>.....................] - ETA: 2:56 - loss: 0.3871 - regression_loss: 0.3616 - classification_loss: 0.0254 156/500 [========>.....................] - ETA: 2:56 - loss: 0.3869 - regression_loss: 0.3615 - classification_loss: 0.0254 157/500 [========>.....................] - ETA: 2:55 - loss: 0.3867 - regression_loss: 0.3613 - classification_loss: 0.0254 158/500 [========>.....................] - ETA: 2:54 - loss: 0.3858 - regression_loss: 0.3605 - classification_loss: 0.0253 159/500 [========>.....................] - ETA: 2:54 - loss: 0.3851 - regression_loss: 0.3599 - classification_loss: 0.0252 160/500 [========>.....................] - ETA: 2:53 - loss: 0.3840 - regression_loss: 0.3588 - classification_loss: 0.0252 161/500 [========>.....................] - ETA: 2:53 - loss: 0.3843 - regression_loss: 0.3591 - classification_loss: 0.0252 162/500 [========>.....................] 
- ETA: 2:52 - loss: 0.3839 - regression_loss: 0.3587 - classification_loss: 0.0252 163/500 [========>.....................] - ETA: 2:52 - loss: 0.3839 - regression_loss: 0.3587 - classification_loss: 0.0252 164/500 [========>.....................] - ETA: 2:51 - loss: 0.3846 - regression_loss: 0.3591 - classification_loss: 0.0255 165/500 [========>.....................] - ETA: 2:51 - loss: 0.3864 - regression_loss: 0.3609 - classification_loss: 0.0255 166/500 [========>.....................] - ETA: 2:50 - loss: 0.3858 - regression_loss: 0.3603 - classification_loss: 0.0254 167/500 [=========>....................] - ETA: 2:50 - loss: 0.3847 - regression_loss: 0.3593 - classification_loss: 0.0254 168/500 [=========>....................] - ETA: 2:49 - loss: 0.3853 - regression_loss: 0.3599 - classification_loss: 0.0254 169/500 [=========>....................] - ETA: 2:49 - loss: 0.3855 - regression_loss: 0.3602 - classification_loss: 0.0253 170/500 [=========>....................] - ETA: 2:48 - loss: 0.3844 - regression_loss: 0.3592 - classification_loss: 0.0253 171/500 [=========>....................] - ETA: 2:48 - loss: 0.3852 - regression_loss: 0.3600 - classification_loss: 0.0252 172/500 [=========>....................] - ETA: 2:47 - loss: 0.3868 - regression_loss: 0.3612 - classification_loss: 0.0256 173/500 [=========>....................] - ETA: 2:47 - loss: 0.3861 - regression_loss: 0.3606 - classification_loss: 0.0256 174/500 [=========>....................] - ETA: 2:46 - loss: 0.3879 - regression_loss: 0.3620 - classification_loss: 0.0259 175/500 [=========>....................] - ETA: 2:46 - loss: 0.3872 - regression_loss: 0.3615 - classification_loss: 0.0258 176/500 [=========>....................] - ETA: 2:45 - loss: 0.3867 - regression_loss: 0.3610 - classification_loss: 0.0257 177/500 [=========>....................] - ETA: 2:45 - loss: 0.3870 - regression_loss: 0.3613 - classification_loss: 0.0257 178/500 [=========>....................] 
- ETA: 2:44 - loss: 0.3870 - regression_loss: 0.3612 - classification_loss: 0.0258 179/500 [=========>....................] - ETA: 2:44 - loss: 0.3862 - regression_loss: 0.3605 - classification_loss: 0.0257 180/500 [=========>....................] - ETA: 2:43 - loss: 0.3851 - regression_loss: 0.3595 - classification_loss: 0.0256 181/500 [=========>....................] - ETA: 2:43 - loss: 0.3850 - regression_loss: 0.3594 - classification_loss: 0.0255 182/500 [=========>....................] - ETA: 2:42 - loss: 0.3842 - regression_loss: 0.3587 - classification_loss: 0.0255 183/500 [=========>....................] - ETA: 2:42 - loss: 0.3831 - regression_loss: 0.3577 - classification_loss: 0.0254 184/500 [==========>...................] - ETA: 2:41 - loss: 0.3845 - regression_loss: 0.3590 - classification_loss: 0.0254 185/500 [==========>...................] - ETA: 2:41 - loss: 0.3839 - regression_loss: 0.3585 - classification_loss: 0.0254 186/500 [==========>...................] - ETA: 2:40 - loss: 0.3831 - regression_loss: 0.3577 - classification_loss: 0.0254 187/500 [==========>...................] - ETA: 2:40 - loss: 0.3834 - regression_loss: 0.3580 - classification_loss: 0.0254 188/500 [==========>...................] - ETA: 2:39 - loss: 0.3843 - regression_loss: 0.3588 - classification_loss: 0.0256 189/500 [==========>...................] - ETA: 2:39 - loss: 0.3837 - regression_loss: 0.3582 - classification_loss: 0.0255 190/500 [==========>...................] - ETA: 2:38 - loss: 0.3835 - regression_loss: 0.3581 - classification_loss: 0.0254 191/500 [==========>...................] - ETA: 2:38 - loss: 0.3827 - regression_loss: 0.3574 - classification_loss: 0.0253 192/500 [==========>...................] - ETA: 2:37 - loss: 0.3823 - regression_loss: 0.3570 - classification_loss: 0.0253 193/500 [==========>...................] - ETA: 2:37 - loss: 0.3814 - regression_loss: 0.3562 - classification_loss: 0.0252 194/500 [==========>...................] 
- ETA: 2:36 - loss: 0.3812 - regression_loss: 0.3560 - classification_loss: 0.0252 195/500 [==========>...................] - ETA: 2:36 - loss: 0.3812 - regression_loss: 0.3559 - classification_loss: 0.0252 196/500 [==========>...................] - ETA: 2:35 - loss: 0.3800 - regression_loss: 0.3548 - classification_loss: 0.0251 197/500 [==========>...................] - ETA: 2:35 - loss: 0.3804 - regression_loss: 0.3551 - classification_loss: 0.0252 198/500 [==========>...................] - ETA: 2:34 - loss: 0.3829 - regression_loss: 0.3574 - classification_loss: 0.0255 199/500 [==========>...................] - ETA: 2:34 - loss: 0.3825 - regression_loss: 0.3570 - classification_loss: 0.0255 200/500 [===========>..................] - ETA: 2:33 - loss: 0.3832 - regression_loss: 0.3578 - classification_loss: 0.0255 201/500 [===========>..................] - ETA: 2:33 - loss: 0.3835 - regression_loss: 0.3580 - classification_loss: 0.0255 202/500 [===========>..................] - ETA: 2:32 - loss: 0.3827 - regression_loss: 0.3573 - classification_loss: 0.0255 203/500 [===========>..................] - ETA: 2:32 - loss: 0.3828 - regression_loss: 0.3573 - classification_loss: 0.0255 204/500 [===========>..................] - ETA: 2:31 - loss: 0.3826 - regression_loss: 0.3571 - classification_loss: 0.0255 205/500 [===========>..................] - ETA: 2:31 - loss: 0.3825 - regression_loss: 0.3570 - classification_loss: 0.0255 206/500 [===========>..................] - ETA: 2:30 - loss: 0.3814 - regression_loss: 0.3559 - classification_loss: 0.0255 207/500 [===========>..................] - ETA: 2:30 - loss: 0.3807 - regression_loss: 0.3554 - classification_loss: 0.0254 208/500 [===========>..................] - ETA: 2:29 - loss: 0.3800 - regression_loss: 0.3547 - classification_loss: 0.0253 209/500 [===========>..................] - ETA: 2:29 - loss: 0.3797 - regression_loss: 0.3544 - classification_loss: 0.0253 210/500 [===========>..................] 
- ETA: 2:28 - loss: 0.3801 - regression_loss: 0.3549 - classification_loss: 0.0253 211/500 [===========>..................] - ETA: 2:28 - loss: 0.3805 - regression_loss: 0.3552 - classification_loss: 0.0253 212/500 [===========>..................] - ETA: 2:27 - loss: 0.3817 - regression_loss: 0.3564 - classification_loss: 0.0253 213/500 [===========>..................] - ETA: 2:27 - loss: 0.3823 - regression_loss: 0.3569 - classification_loss: 0.0253 214/500 [===========>..................] - ETA: 2:26 - loss: 0.3828 - regression_loss: 0.3574 - classification_loss: 0.0254 215/500 [===========>..................] - ETA: 2:26 - loss: 0.3821 - regression_loss: 0.3567 - classification_loss: 0.0254 216/500 [===========>..................] - ETA: 2:25 - loss: 0.3814 - regression_loss: 0.3560 - classification_loss: 0.0254 217/500 [============>.................] - ETA: 2:25 - loss: 0.3816 - regression_loss: 0.3563 - classification_loss: 0.0254 218/500 [============>.................] - ETA: 2:24 - loss: 0.3817 - regression_loss: 0.3564 - classification_loss: 0.0253 219/500 [============>.................] - ETA: 2:24 - loss: 0.3841 - regression_loss: 0.3585 - classification_loss: 0.0256 220/500 [============>.................] - ETA: 2:23 - loss: 0.3840 - regression_loss: 0.3584 - classification_loss: 0.0256 221/500 [============>.................] - ETA: 2:22 - loss: 0.3837 - regression_loss: 0.3582 - classification_loss: 0.0255 222/500 [============>.................] - ETA: 2:22 - loss: 0.3828 - regression_loss: 0.3574 - classification_loss: 0.0255 223/500 [============>.................] - ETA: 2:21 - loss: 0.3830 - regression_loss: 0.3575 - classification_loss: 0.0255 224/500 [============>.................] - ETA: 2:21 - loss: 0.3831 - regression_loss: 0.3575 - classification_loss: 0.0255 225/500 [============>.................] - ETA: 2:20 - loss: 0.3828 - regression_loss: 0.3573 - classification_loss: 0.0255 226/500 [============>.................] 
- ETA: 2:20 - loss: 0.3830 - regression_loss: 0.3576 - classification_loss: 0.0255 227/500 [============>.................] - ETA: 2:19 - loss: 0.3834 - regression_loss: 0.3579 - classification_loss: 0.0254 228/500 [============>.................] - ETA: 2:19 - loss: 0.3834 - regression_loss: 0.3579 - classification_loss: 0.0255 229/500 [============>.................] - ETA: 2:18 - loss: 0.3828 - regression_loss: 0.3573 - classification_loss: 0.0255 230/500 [============>.................] - ETA: 2:18 - loss: 0.3825 - regression_loss: 0.3571 - classification_loss: 0.0254 231/500 [============>.................] - ETA: 2:17 - loss: 0.3823 - regression_loss: 0.3569 - classification_loss: 0.0253 232/500 [============>.................] - ETA: 2:17 - loss: 0.3850 - regression_loss: 0.3587 - classification_loss: 0.0263 233/500 [============>.................] - ETA: 2:16 - loss: 0.3843 - regression_loss: 0.3581 - classification_loss: 0.0263 234/500 [=============>................] - ETA: 2:16 - loss: 0.3867 - regression_loss: 0.3602 - classification_loss: 0.0265 235/500 [=============>................] - ETA: 2:15 - loss: 0.3874 - regression_loss: 0.3609 - classification_loss: 0.0265 236/500 [=============>................] - ETA: 2:15 - loss: 0.3871 - regression_loss: 0.3606 - classification_loss: 0.0265 237/500 [=============>................] - ETA: 2:14 - loss: 0.3883 - regression_loss: 0.3619 - classification_loss: 0.0265 238/500 [=============>................] - ETA: 2:14 - loss: 0.3911 - regression_loss: 0.3645 - classification_loss: 0.0266 239/500 [=============>................] - ETA: 2:13 - loss: 0.3912 - regression_loss: 0.3647 - classification_loss: 0.0265 240/500 [=============>................] - ETA: 2:13 - loss: 0.3917 - regression_loss: 0.3652 - classification_loss: 0.0266 241/500 [=============>................] - ETA: 2:12 - loss: 0.3932 - regression_loss: 0.3663 - classification_loss: 0.0269 242/500 [=============>................] 
- ETA: 2:12 - loss: 0.3927 - regression_loss: 0.3658 - classification_loss: 0.0268 243/500 [=============>................] - ETA: 2:11 - loss: 0.3927 - regression_loss: 0.3658 - classification_loss: 0.0268 244/500 [=============>................] - ETA: 2:11 - loss: 0.3930 - regression_loss: 0.3662 - classification_loss: 0.0268 245/500 [=============>................] - ETA: 2:10 - loss: 0.3924 - regression_loss: 0.3656 - classification_loss: 0.0268 246/500 [=============>................] - ETA: 2:10 - loss: 0.3916 - regression_loss: 0.3649 - classification_loss: 0.0267 247/500 [=============>................] - ETA: 2:09 - loss: 0.3918 - regression_loss: 0.3651 - classification_loss: 0.0267 248/500 [=============>................] - ETA: 2:09 - loss: 0.3923 - regression_loss: 0.3656 - classification_loss: 0.0267 249/500 [=============>................] - ETA: 2:08 - loss: 0.3930 - regression_loss: 0.3662 - classification_loss: 0.0267 250/500 [==============>...............] - ETA: 2:08 - loss: 0.3926 - regression_loss: 0.3659 - classification_loss: 0.0267 251/500 [==============>...............] - ETA: 2:07 - loss: 0.3923 - regression_loss: 0.3657 - classification_loss: 0.0267 252/500 [==============>...............] - ETA: 2:07 - loss: 0.3916 - regression_loss: 0.3649 - classification_loss: 0.0266 253/500 [==============>...............] - ETA: 2:06 - loss: 0.3912 - regression_loss: 0.3647 - classification_loss: 0.0266 254/500 [==============>...............] - ETA: 2:06 - loss: 0.3917 - regression_loss: 0.3651 - classification_loss: 0.0266 255/500 [==============>...............] - ETA: 2:05 - loss: 0.3909 - regression_loss: 0.3644 - classification_loss: 0.0265 256/500 [==============>...............] - ETA: 2:05 - loss: 0.3908 - regression_loss: 0.3643 - classification_loss: 0.0265 257/500 [==============>...............] - ETA: 2:04 - loss: 0.3905 - regression_loss: 0.3640 - classification_loss: 0.0265 258/500 [==============>...............] 
- ETA: 2:04 - loss: 0.3909 - regression_loss: 0.3644 - classification_loss: 0.0266 259/500 [==============>...............] - ETA: 2:03 - loss: 0.3900 - regression_loss: 0.3635 - classification_loss: 0.0265 260/500 [==============>...............] - ETA: 2:03 - loss: 0.3898 - regression_loss: 0.3633 - classification_loss: 0.0265 261/500 [==============>...............] - ETA: 2:02 - loss: 0.3891 - regression_loss: 0.3626 - classification_loss: 0.0265 262/500 [==============>...............] - ETA: 2:01 - loss: 0.3888 - regression_loss: 0.3623 - classification_loss: 0.0265 263/500 [==============>...............] - ETA: 2:01 - loss: 0.3895 - regression_loss: 0.3629 - classification_loss: 0.0266 264/500 [==============>...............] - ETA: 2:00 - loss: 0.3886 - regression_loss: 0.3620 - classification_loss: 0.0265 265/500 [==============>...............] - ETA: 2:00 - loss: 0.3890 - regression_loss: 0.3624 - classification_loss: 0.0266 266/500 [==============>...............] - ETA: 1:59 - loss: 0.3887 - regression_loss: 0.3622 - classification_loss: 0.0265 267/500 [===============>..............] - ETA: 1:59 - loss: 0.3888 - regression_loss: 0.3623 - classification_loss: 0.0265 268/500 [===============>..............] - ETA: 1:58 - loss: 0.3888 - regression_loss: 0.3622 - classification_loss: 0.0265 269/500 [===============>..............] - ETA: 1:58 - loss: 0.3882 - regression_loss: 0.3617 - classification_loss: 0.0265 270/500 [===============>..............] - ETA: 1:57 - loss: 0.3877 - regression_loss: 0.3612 - classification_loss: 0.0264 271/500 [===============>..............] - ETA: 1:57 - loss: 0.3887 - regression_loss: 0.3623 - classification_loss: 0.0264 272/500 [===============>..............] - ETA: 1:56 - loss: 0.3886 - regression_loss: 0.3621 - classification_loss: 0.0264 273/500 [===============>..............] - ETA: 1:56 - loss: 0.3886 - regression_loss: 0.3622 - classification_loss: 0.0265 274/500 [===============>..............] 
- ETA: 1:55 - loss: 0.3880 - regression_loss: 0.3616 - classification_loss: 0.0264 275/500 [===============>..............] - ETA: 1:55 - loss: 0.3880 - regression_loss: 0.3616 - classification_loss: 0.0264 276/500 [===============>..............] - ETA: 1:54 - loss: 0.3885 - regression_loss: 0.3622 - classification_loss: 0.0264 277/500 [===============>..............] - ETA: 1:54 - loss: 0.3890 - regression_loss: 0.3627 - classification_loss: 0.0264 278/500 [===============>..............] - ETA: 1:53 - loss: 0.3899 - regression_loss: 0.3635 - classification_loss: 0.0264 279/500 [===============>..............] - ETA: 1:53 - loss: 0.3899 - regression_loss: 0.3635 - classification_loss: 0.0264 280/500 [===============>..............] - ETA: 1:52 - loss: 0.3895 - regression_loss: 0.3632 - classification_loss: 0.0263 281/500 [===============>..............] - ETA: 1:52 - loss: 0.3889 - regression_loss: 0.3626 - classification_loss: 0.0263 282/500 [===============>..............] - ETA: 1:51 - loss: 0.3885 - regression_loss: 0.3623 - classification_loss: 0.0263 283/500 [===============>..............] - ETA: 1:51 - loss: 0.3883 - regression_loss: 0.3621 - classification_loss: 0.0262 284/500 [================>.............] - ETA: 1:50 - loss: 0.3883 - regression_loss: 0.3621 - classification_loss: 0.0262 285/500 [================>.............] - ETA: 1:50 - loss: 0.3880 - regression_loss: 0.3618 - classification_loss: 0.0262 286/500 [================>.............] - ETA: 1:49 - loss: 0.3876 - regression_loss: 0.3615 - classification_loss: 0.0262 287/500 [================>.............] - ETA: 1:49 - loss: 0.3877 - regression_loss: 0.3616 - classification_loss: 0.0262 288/500 [================>.............] - ETA: 1:48 - loss: 0.3877 - regression_loss: 0.3615 - classification_loss: 0.0261 289/500 [================>.............] - ETA: 1:48 - loss: 0.3873 - regression_loss: 0.3613 - classification_loss: 0.0261 290/500 [================>.............] 
- ETA: 1:47 - loss: 0.3884 - regression_loss: 0.3623 - classification_loss: 0.0261 291/500 [================>.............] - ETA: 1:47 - loss: 0.3879 - regression_loss: 0.3619 - classification_loss: 0.0261 292/500 [================>.............] - ETA: 1:46 - loss: 0.3869 - regression_loss: 0.3609 - classification_loss: 0.0260 293/500 [================>.............] - ETA: 1:46 - loss: 0.3862 - regression_loss: 0.3602 - classification_loss: 0.0260 294/500 [================>.............] - ETA: 1:45 - loss: 0.3857 - regression_loss: 0.3598 - classification_loss: 0.0259 295/500 [================>.............] - ETA: 1:44 - loss: 0.3869 - regression_loss: 0.3608 - classification_loss: 0.0261 296/500 [================>.............] - ETA: 1:44 - loss: 0.3862 - regression_loss: 0.3602 - classification_loss: 0.0260 297/500 [================>.............] - ETA: 1:43 - loss: 0.3857 - regression_loss: 0.3597 - classification_loss: 0.0260 298/500 [================>.............] - ETA: 1:43 - loss: 0.3858 - regression_loss: 0.3597 - classification_loss: 0.0261 299/500 [================>.............] - ETA: 1:42 - loss: 0.3852 - regression_loss: 0.3592 - classification_loss: 0.0260 300/500 [=================>............] - ETA: 1:42 - loss: 0.3867 - regression_loss: 0.3605 - classification_loss: 0.0262 301/500 [=================>............] - ETA: 1:41 - loss: 0.3859 - regression_loss: 0.3597 - classification_loss: 0.0261 302/500 [=================>............] - ETA: 1:41 - loss: 0.3854 - regression_loss: 0.3593 - classification_loss: 0.0261 303/500 [=================>............] - ETA: 1:40 - loss: 0.3849 - regression_loss: 0.3589 - classification_loss: 0.0261 304/500 [=================>............] - ETA: 1:40 - loss: 0.3857 - regression_loss: 0.3596 - classification_loss: 0.0261 305/500 [=================>............] - ETA: 1:39 - loss: 0.3854 - regression_loss: 0.3593 - classification_loss: 0.0262 306/500 [=================>............] 
[... per-step progress updates for steps 307-500 elided; final state of the progress bar below ...]
500/500 [==============================] - 256s 512ms/step - loss: 0.3779 - regression_loss: 0.3516 - classification_loss: 0.0263
326 instances of class plum with average precision: 0.8277
mAP: 0.8277
Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/150
[... per-step progress updates for epoch 4, steps 1-141 elided ...]
- ETA: 3:03 - loss: 0.3721 - regression_loss: 0.3477 - classification_loss: 0.0245 142/500 [=======>......................] - ETA: 3:03 - loss: 0.3715 - regression_loss: 0.3470 - classification_loss: 0.0244 143/500 [=======>......................] - ETA: 3:02 - loss: 0.3732 - regression_loss: 0.3488 - classification_loss: 0.0245 144/500 [=======>......................] - ETA: 3:02 - loss: 0.3734 - regression_loss: 0.3488 - classification_loss: 0.0246 145/500 [=======>......................] - ETA: 3:01 - loss: 0.3755 - regression_loss: 0.3504 - classification_loss: 0.0252 146/500 [=======>......................] - ETA: 3:01 - loss: 0.3752 - regression_loss: 0.3501 - classification_loss: 0.0251 147/500 [=======>......................] - ETA: 3:00 - loss: 0.3756 - regression_loss: 0.3504 - classification_loss: 0.0252 148/500 [=======>......................] - ETA: 3:00 - loss: 0.3737 - regression_loss: 0.3487 - classification_loss: 0.0250 149/500 [=======>......................] - ETA: 2:59 - loss: 0.3732 - regression_loss: 0.3482 - classification_loss: 0.0250 150/500 [========>.....................] - ETA: 2:59 - loss: 0.3732 - regression_loss: 0.3481 - classification_loss: 0.0250 151/500 [========>.....................] - ETA: 2:58 - loss: 0.3721 - regression_loss: 0.3472 - classification_loss: 0.0249 152/500 [========>.....................] - ETA: 2:58 - loss: 0.3733 - regression_loss: 0.3483 - classification_loss: 0.0250 153/500 [========>.....................] - ETA: 2:57 - loss: 0.3732 - regression_loss: 0.3481 - classification_loss: 0.0250 154/500 [========>.....................] - ETA: 2:57 - loss: 0.3726 - regression_loss: 0.3476 - classification_loss: 0.0250 155/500 [========>.....................] - ETA: 2:56 - loss: 0.3725 - regression_loss: 0.3474 - classification_loss: 0.0251 156/500 [========>.....................] - ETA: 2:56 - loss: 0.3720 - regression_loss: 0.3470 - classification_loss: 0.0250 157/500 [========>.....................] 
- ETA: 2:55 - loss: 0.3719 - regression_loss: 0.3470 - classification_loss: 0.0249 158/500 [========>.....................] - ETA: 2:55 - loss: 0.3720 - regression_loss: 0.3470 - classification_loss: 0.0250 159/500 [========>.....................] - ETA: 2:54 - loss: 0.3811 - regression_loss: 0.3545 - classification_loss: 0.0266 160/500 [========>.....................] - ETA: 2:54 - loss: 0.3808 - regression_loss: 0.3542 - classification_loss: 0.0266 161/500 [========>.....................] - ETA: 2:53 - loss: 0.3808 - regression_loss: 0.3543 - classification_loss: 0.0265 162/500 [========>.....................] - ETA: 2:53 - loss: 0.3804 - regression_loss: 0.3539 - classification_loss: 0.0265 163/500 [========>.....................] - ETA: 2:52 - loss: 0.3790 - regression_loss: 0.3526 - classification_loss: 0.0264 164/500 [========>.....................] - ETA: 2:51 - loss: 0.3788 - regression_loss: 0.3524 - classification_loss: 0.0264 165/500 [========>.....................] - ETA: 2:51 - loss: 0.3786 - regression_loss: 0.3522 - classification_loss: 0.0264 166/500 [========>.....................] - ETA: 2:50 - loss: 0.3781 - regression_loss: 0.3517 - classification_loss: 0.0264 167/500 [=========>....................] - ETA: 2:50 - loss: 0.3779 - regression_loss: 0.3516 - classification_loss: 0.0264 168/500 [=========>....................] - ETA: 2:49 - loss: 0.3801 - regression_loss: 0.3535 - classification_loss: 0.0267 169/500 [=========>....................] - ETA: 2:49 - loss: 0.3820 - regression_loss: 0.3552 - classification_loss: 0.0268 170/500 [=========>....................] - ETA: 2:48 - loss: 0.3822 - regression_loss: 0.3551 - classification_loss: 0.0271 171/500 [=========>....................] - ETA: 2:48 - loss: 0.3822 - regression_loss: 0.3551 - classification_loss: 0.0271 172/500 [=========>....................] - ETA: 2:47 - loss: 0.3816 - regression_loss: 0.3546 - classification_loss: 0.0270 173/500 [=========>....................] 
- ETA: 2:47 - loss: 0.3809 - regression_loss: 0.3539 - classification_loss: 0.0269 174/500 [=========>....................] - ETA: 2:46 - loss: 0.3802 - regression_loss: 0.3533 - classification_loss: 0.0269 175/500 [=========>....................] - ETA: 2:46 - loss: 0.3796 - regression_loss: 0.3528 - classification_loss: 0.0268 176/500 [=========>....................] - ETA: 2:45 - loss: 0.3784 - regression_loss: 0.3516 - classification_loss: 0.0268 177/500 [=========>....................] - ETA: 2:45 - loss: 0.3785 - regression_loss: 0.3517 - classification_loss: 0.0268 178/500 [=========>....................] - ETA: 2:44 - loss: 0.3781 - regression_loss: 0.3513 - classification_loss: 0.0268 179/500 [=========>....................] - ETA: 2:44 - loss: 0.3791 - regression_loss: 0.3523 - classification_loss: 0.0268 180/500 [=========>....................] - ETA: 2:43 - loss: 0.3790 - regression_loss: 0.3523 - classification_loss: 0.0267 181/500 [=========>....................] - ETA: 2:43 - loss: 0.3788 - regression_loss: 0.3521 - classification_loss: 0.0267 182/500 [=========>....................] - ETA: 2:42 - loss: 0.3780 - regression_loss: 0.3514 - classification_loss: 0.0266 183/500 [=========>....................] - ETA: 2:42 - loss: 0.3786 - regression_loss: 0.3520 - classification_loss: 0.0266 184/500 [==========>...................] - ETA: 2:41 - loss: 0.3780 - regression_loss: 0.3516 - classification_loss: 0.0265 185/500 [==========>...................] - ETA: 2:41 - loss: 0.3786 - regression_loss: 0.3521 - classification_loss: 0.0265 186/500 [==========>...................] - ETA: 2:40 - loss: 0.3777 - regression_loss: 0.3514 - classification_loss: 0.0264 187/500 [==========>...................] - ETA: 2:40 - loss: 0.3782 - regression_loss: 0.3518 - classification_loss: 0.0263 188/500 [==========>...................] - ETA: 2:39 - loss: 0.3778 - regression_loss: 0.3515 - classification_loss: 0.0263 189/500 [==========>...................] 
- ETA: 2:39 - loss: 0.3792 - regression_loss: 0.3528 - classification_loss: 0.0265 190/500 [==========>...................] - ETA: 2:38 - loss: 0.3778 - regression_loss: 0.3514 - classification_loss: 0.0264 191/500 [==========>...................] - ETA: 2:38 - loss: 0.3769 - regression_loss: 0.3506 - classification_loss: 0.0263 192/500 [==========>...................] - ETA: 2:37 - loss: 0.3781 - regression_loss: 0.3518 - classification_loss: 0.0263 193/500 [==========>...................] - ETA: 2:37 - loss: 0.3774 - regression_loss: 0.3512 - classification_loss: 0.0263 194/500 [==========>...................] - ETA: 2:36 - loss: 0.3774 - regression_loss: 0.3512 - classification_loss: 0.0262 195/500 [==========>...................] - ETA: 2:36 - loss: 0.3772 - regression_loss: 0.3510 - classification_loss: 0.0263 196/500 [==========>...................] - ETA: 2:35 - loss: 0.3781 - regression_loss: 0.3516 - classification_loss: 0.0264 197/500 [==========>...................] - ETA: 2:35 - loss: 0.3806 - regression_loss: 0.3541 - classification_loss: 0.0266 198/500 [==========>...................] - ETA: 2:34 - loss: 0.3810 - regression_loss: 0.3544 - classification_loss: 0.0265 199/500 [==========>...................] - ETA: 2:34 - loss: 0.3845 - regression_loss: 0.3569 - classification_loss: 0.0276 200/500 [===========>..................] - ETA: 2:33 - loss: 0.3842 - regression_loss: 0.3567 - classification_loss: 0.0275 201/500 [===========>..................] - ETA: 2:33 - loss: 0.3835 - regression_loss: 0.3560 - classification_loss: 0.0275 202/500 [===========>..................] - ETA: 2:32 - loss: 0.3828 - regression_loss: 0.3554 - classification_loss: 0.0274 203/500 [===========>..................] - ETA: 2:32 - loss: 0.3831 - regression_loss: 0.3557 - classification_loss: 0.0274 204/500 [===========>..................] - ETA: 2:31 - loss: 0.3825 - regression_loss: 0.3551 - classification_loss: 0.0274 205/500 [===========>..................] 
- ETA: 2:31 - loss: 0.3827 - regression_loss: 0.3553 - classification_loss: 0.0274 206/500 [===========>..................] - ETA: 2:30 - loss: 0.3819 - regression_loss: 0.3546 - classification_loss: 0.0273 207/500 [===========>..................] - ETA: 2:30 - loss: 0.3820 - regression_loss: 0.3547 - classification_loss: 0.0273 208/500 [===========>..................] - ETA: 2:29 - loss: 0.3831 - regression_loss: 0.3558 - classification_loss: 0.0273 209/500 [===========>..................] - ETA: 2:29 - loss: 0.3830 - regression_loss: 0.3557 - classification_loss: 0.0273 210/500 [===========>..................] - ETA: 2:28 - loss: 0.3822 - regression_loss: 0.3550 - classification_loss: 0.0272 211/500 [===========>..................] - ETA: 2:28 - loss: 0.3826 - regression_loss: 0.3553 - classification_loss: 0.0273 212/500 [===========>..................] - ETA: 2:27 - loss: 0.3823 - regression_loss: 0.3551 - classification_loss: 0.0272 213/500 [===========>..................] - ETA: 2:27 - loss: 0.3824 - regression_loss: 0.3552 - classification_loss: 0.0272 214/500 [===========>..................] - ETA: 2:26 - loss: 0.3828 - regression_loss: 0.3556 - classification_loss: 0.0272 215/500 [===========>..................] - ETA: 2:26 - loss: 0.3830 - regression_loss: 0.3557 - classification_loss: 0.0273 216/500 [===========>..................] - ETA: 2:25 - loss: 0.3830 - regression_loss: 0.3557 - classification_loss: 0.0273 217/500 [============>.................] - ETA: 2:25 - loss: 0.3830 - regression_loss: 0.3557 - classification_loss: 0.0273 218/500 [============>.................] - ETA: 2:24 - loss: 0.3823 - regression_loss: 0.3550 - classification_loss: 0.0273 219/500 [============>.................] - ETA: 2:24 - loss: 0.3824 - regression_loss: 0.3551 - classification_loss: 0.0273 220/500 [============>.................] - ETA: 2:23 - loss: 0.3824 - regression_loss: 0.3552 - classification_loss: 0.0273 221/500 [============>.................] 
- ETA: 2:23 - loss: 0.3818 - regression_loss: 0.3546 - classification_loss: 0.0272 222/500 [============>.................] - ETA: 2:22 - loss: 0.3812 - regression_loss: 0.3540 - classification_loss: 0.0272 223/500 [============>.................] - ETA: 2:22 - loss: 0.3816 - regression_loss: 0.3545 - classification_loss: 0.0271 224/500 [============>.................] - ETA: 2:21 - loss: 0.3811 - regression_loss: 0.3540 - classification_loss: 0.0270 225/500 [============>.................] - ETA: 2:21 - loss: 0.3814 - regression_loss: 0.3543 - classification_loss: 0.0270 226/500 [============>.................] - ETA: 2:20 - loss: 0.3816 - regression_loss: 0.3546 - classification_loss: 0.0270 227/500 [============>.................] - ETA: 2:20 - loss: 0.3809 - regression_loss: 0.3539 - classification_loss: 0.0270 228/500 [============>.................] - ETA: 2:19 - loss: 0.3805 - regression_loss: 0.3536 - classification_loss: 0.0269 229/500 [============>.................] - ETA: 2:19 - loss: 0.3804 - regression_loss: 0.3535 - classification_loss: 0.0269 230/500 [============>.................] - ETA: 2:18 - loss: 0.3796 - regression_loss: 0.3528 - classification_loss: 0.0268 231/500 [============>.................] - ETA: 2:18 - loss: 0.3791 - regression_loss: 0.3524 - classification_loss: 0.0267 232/500 [============>.................] - ETA: 2:17 - loss: 0.3783 - regression_loss: 0.3516 - classification_loss: 0.0267 233/500 [============>.................] - ETA: 2:17 - loss: 0.3784 - regression_loss: 0.3518 - classification_loss: 0.0267 234/500 [=============>................] - ETA: 2:16 - loss: 0.3789 - regression_loss: 0.3523 - classification_loss: 0.0266 235/500 [=============>................] - ETA: 2:15 - loss: 0.3789 - regression_loss: 0.3524 - classification_loss: 0.0265 236/500 [=============>................] - ETA: 2:15 - loss: 0.3788 - regression_loss: 0.3523 - classification_loss: 0.0265 237/500 [=============>................] 
- ETA: 2:14 - loss: 0.3783 - regression_loss: 0.3519 - classification_loss: 0.0265 238/500 [=============>................] - ETA: 2:14 - loss: 0.3787 - regression_loss: 0.3522 - classification_loss: 0.0265 239/500 [=============>................] - ETA: 2:13 - loss: 0.3790 - regression_loss: 0.3525 - classification_loss: 0.0264 240/500 [=============>................] - ETA: 2:13 - loss: 0.3824 - regression_loss: 0.3558 - classification_loss: 0.0266 241/500 [=============>................] - ETA: 2:12 - loss: 0.3824 - regression_loss: 0.3557 - classification_loss: 0.0267 242/500 [=============>................] - ETA: 2:12 - loss: 0.3826 - regression_loss: 0.3559 - classification_loss: 0.0268 243/500 [=============>................] - ETA: 2:11 - loss: 0.3824 - regression_loss: 0.3557 - classification_loss: 0.0268 244/500 [=============>................] - ETA: 2:11 - loss: 0.3822 - regression_loss: 0.3555 - classification_loss: 0.0267 245/500 [=============>................] - ETA: 2:10 - loss: 0.3821 - regression_loss: 0.3554 - classification_loss: 0.0267 246/500 [=============>................] - ETA: 2:10 - loss: 0.3817 - regression_loss: 0.3550 - classification_loss: 0.0266 247/500 [=============>................] - ETA: 2:09 - loss: 0.3815 - regression_loss: 0.3549 - classification_loss: 0.0266 248/500 [=============>................] - ETA: 2:09 - loss: 0.3812 - regression_loss: 0.3546 - classification_loss: 0.0265 249/500 [=============>................] - ETA: 2:08 - loss: 0.3803 - regression_loss: 0.3538 - classification_loss: 0.0265 250/500 [==============>...............] - ETA: 2:08 - loss: 0.3800 - regression_loss: 0.3536 - classification_loss: 0.0264 251/500 [==============>...............] - ETA: 2:07 - loss: 0.3794 - regression_loss: 0.3530 - classification_loss: 0.0264 252/500 [==============>...............] - ETA: 2:07 - loss: 0.3797 - regression_loss: 0.3534 - classification_loss: 0.0263 253/500 [==============>...............] 
- ETA: 2:06 - loss: 0.3797 - regression_loss: 0.3534 - classification_loss: 0.0263 254/500 [==============>...............] - ETA: 2:06 - loss: 0.3794 - regression_loss: 0.3531 - classification_loss: 0.0263 255/500 [==============>...............] - ETA: 2:05 - loss: 0.3791 - regression_loss: 0.3529 - classification_loss: 0.0263 256/500 [==============>...............] - ETA: 2:05 - loss: 0.3781 - regression_loss: 0.3519 - classification_loss: 0.0262 257/500 [==============>...............] - ETA: 2:04 - loss: 0.3775 - regression_loss: 0.3514 - classification_loss: 0.0262 258/500 [==============>...............] - ETA: 2:04 - loss: 0.3772 - regression_loss: 0.3508 - classification_loss: 0.0264 259/500 [==============>...............] - ETA: 2:03 - loss: 0.3774 - regression_loss: 0.3510 - classification_loss: 0.0264 260/500 [==============>...............] - ETA: 2:03 - loss: 0.3767 - regression_loss: 0.3504 - classification_loss: 0.0263 261/500 [==============>...............] - ETA: 2:02 - loss: 0.3762 - regression_loss: 0.3499 - classification_loss: 0.0263 262/500 [==============>...............] - ETA: 2:02 - loss: 0.3766 - regression_loss: 0.3503 - classification_loss: 0.0263 263/500 [==============>...............] - ETA: 2:01 - loss: 0.3765 - regression_loss: 0.3502 - classification_loss: 0.0263 264/500 [==============>...............] - ETA: 2:01 - loss: 0.3766 - regression_loss: 0.3504 - classification_loss: 0.0262 265/500 [==============>...............] - ETA: 2:00 - loss: 0.3769 - regression_loss: 0.3507 - classification_loss: 0.0262 266/500 [==============>...............] - ETA: 1:59 - loss: 0.3763 - regression_loss: 0.3501 - classification_loss: 0.0262 267/500 [===============>..............] - ETA: 1:59 - loss: 0.3758 - regression_loss: 0.3497 - classification_loss: 0.0261 268/500 [===============>..............] - ETA: 1:58 - loss: 0.3759 - regression_loss: 0.3498 - classification_loss: 0.0261 269/500 [===============>..............] 
- ETA: 1:58 - loss: 0.3771 - regression_loss: 0.3510 - classification_loss: 0.0261 270/500 [===============>..............] - ETA: 1:57 - loss: 0.3801 - regression_loss: 0.3537 - classification_loss: 0.0264 271/500 [===============>..............] - ETA: 1:57 - loss: 0.3816 - regression_loss: 0.3549 - classification_loss: 0.0267 272/500 [===============>..............] - ETA: 1:56 - loss: 0.3819 - regression_loss: 0.3552 - classification_loss: 0.0267 273/500 [===============>..............] - ETA: 1:56 - loss: 0.3820 - regression_loss: 0.3554 - classification_loss: 0.0267 274/500 [===============>..............] - ETA: 1:55 - loss: 0.3819 - regression_loss: 0.3553 - classification_loss: 0.0266 275/500 [===============>..............] - ETA: 1:55 - loss: 0.3833 - regression_loss: 0.3566 - classification_loss: 0.0267 276/500 [===============>..............] - ETA: 1:54 - loss: 0.3841 - regression_loss: 0.3573 - classification_loss: 0.0267 277/500 [===============>..............] - ETA: 1:54 - loss: 0.3840 - regression_loss: 0.3572 - classification_loss: 0.0268 278/500 [===============>..............] - ETA: 1:53 - loss: 0.3850 - regression_loss: 0.3579 - classification_loss: 0.0271 279/500 [===============>..............] - ETA: 1:53 - loss: 0.3843 - regression_loss: 0.3572 - classification_loss: 0.0270 280/500 [===============>..............] - ETA: 1:52 - loss: 0.3838 - regression_loss: 0.3568 - classification_loss: 0.0270 281/500 [===============>..............] - ETA: 1:52 - loss: 0.3840 - regression_loss: 0.3570 - classification_loss: 0.0270 282/500 [===============>..............] - ETA: 1:51 - loss: 0.3855 - regression_loss: 0.3585 - classification_loss: 0.0271 283/500 [===============>..............] - ETA: 1:51 - loss: 0.3858 - regression_loss: 0.3587 - classification_loss: 0.0271 284/500 [================>.............] - ETA: 1:50 - loss: 0.3860 - regression_loss: 0.3589 - classification_loss: 0.0271 285/500 [================>.............] 
- ETA: 1:50 - loss: 0.3854 - regression_loss: 0.3583 - classification_loss: 0.0271 286/500 [================>.............] - ETA: 1:49 - loss: 0.3852 - regression_loss: 0.3581 - classification_loss: 0.0271 287/500 [================>.............] - ETA: 1:49 - loss: 0.3863 - regression_loss: 0.3592 - classification_loss: 0.0271 288/500 [================>.............] - ETA: 1:48 - loss: 0.3859 - regression_loss: 0.3589 - classification_loss: 0.0270 289/500 [================>.............] - ETA: 1:48 - loss: 0.3854 - regression_loss: 0.3584 - classification_loss: 0.0270 290/500 [================>.............] - ETA: 1:47 - loss: 0.3856 - regression_loss: 0.3585 - classification_loss: 0.0270 291/500 [================>.............] - ETA: 1:47 - loss: 0.3854 - regression_loss: 0.3584 - classification_loss: 0.0270 292/500 [================>.............] - ETA: 1:46 - loss: 0.3857 - regression_loss: 0.3588 - classification_loss: 0.0269 293/500 [================>.............] - ETA: 1:46 - loss: 0.3863 - regression_loss: 0.3594 - classification_loss: 0.0269 294/500 [================>.............] - ETA: 1:45 - loss: 0.3862 - regression_loss: 0.3593 - classification_loss: 0.0269 295/500 [================>.............] - ETA: 1:45 - loss: 0.3864 - regression_loss: 0.3595 - classification_loss: 0.0269 296/500 [================>.............] - ETA: 1:44 - loss: 0.3858 - regression_loss: 0.3590 - classification_loss: 0.0269 297/500 [================>.............] - ETA: 1:44 - loss: 0.3856 - regression_loss: 0.3588 - classification_loss: 0.0268 298/500 [================>.............] - ETA: 1:43 - loss: 0.3854 - regression_loss: 0.3586 - classification_loss: 0.0268 299/500 [================>.............] - ETA: 1:43 - loss: 0.3851 - regression_loss: 0.3583 - classification_loss: 0.0268 300/500 [=================>............] - ETA: 1:42 - loss: 0.3847 - regression_loss: 0.3580 - classification_loss: 0.0267 301/500 [=================>............] 
- ETA: 1:42 - loss: 0.3843 - regression_loss: 0.3576 - classification_loss: 0.0267 302/500 [=================>............] - ETA: 1:41 - loss: 0.3850 - regression_loss: 0.3582 - classification_loss: 0.0268 303/500 [=================>............] - ETA: 1:41 - loss: 0.3847 - regression_loss: 0.3580 - classification_loss: 0.0267 304/500 [=================>............] - ETA: 1:40 - loss: 0.3846 - regression_loss: 0.3579 - classification_loss: 0.0267 305/500 [=================>............] - ETA: 1:40 - loss: 0.3845 - regression_loss: 0.3578 - classification_loss: 0.0267 306/500 [=================>............] - ETA: 1:39 - loss: 0.3836 - regression_loss: 0.3570 - classification_loss: 0.0266 307/500 [=================>............] - ETA: 1:39 - loss: 0.3838 - regression_loss: 0.3571 - classification_loss: 0.0267 308/500 [=================>............] - ETA: 1:38 - loss: 0.3839 - regression_loss: 0.3573 - classification_loss: 0.0267 309/500 [=================>............] - ETA: 1:37 - loss: 0.3839 - regression_loss: 0.3573 - classification_loss: 0.0266 310/500 [=================>............] - ETA: 1:37 - loss: 0.3832 - regression_loss: 0.3567 - classification_loss: 0.0266 311/500 [=================>............] - ETA: 1:36 - loss: 0.3834 - regression_loss: 0.3568 - classification_loss: 0.0266 312/500 [=================>............] - ETA: 1:36 - loss: 0.3832 - regression_loss: 0.3567 - classification_loss: 0.0265 313/500 [=================>............] - ETA: 1:35 - loss: 0.3827 - regression_loss: 0.3562 - classification_loss: 0.0265 314/500 [=================>............] - ETA: 1:35 - loss: 0.3826 - regression_loss: 0.3562 - classification_loss: 0.0265 315/500 [=================>............] - ETA: 1:34 - loss: 0.3829 - regression_loss: 0.3564 - classification_loss: 0.0265 316/500 [=================>............] - ETA: 1:34 - loss: 0.3824 - regression_loss: 0.3560 - classification_loss: 0.0264 317/500 [==================>...........] 
- ETA: 1:33 - loss: 0.3821 - regression_loss: 0.3557 - classification_loss: 0.0264 318/500 [==================>...........] - ETA: 1:33 - loss: 0.3816 - regression_loss: 0.3553 - classification_loss: 0.0264 319/500 [==================>...........] - ETA: 1:32 - loss: 0.3815 - regression_loss: 0.3552 - classification_loss: 0.0263 320/500 [==================>...........] - ETA: 1:32 - loss: 0.3811 - regression_loss: 0.3548 - classification_loss: 0.0263 321/500 [==================>...........] - ETA: 1:31 - loss: 0.3810 - regression_loss: 0.3548 - classification_loss: 0.0262 322/500 [==================>...........] - ETA: 1:31 - loss: 0.3806 - regression_loss: 0.3544 - classification_loss: 0.0262 323/500 [==================>...........] - ETA: 1:30 - loss: 0.3804 - regression_loss: 0.3543 - classification_loss: 0.0261 324/500 [==================>...........] - ETA: 1:30 - loss: 0.3822 - regression_loss: 0.3559 - classification_loss: 0.0262 325/500 [==================>...........] - ETA: 1:29 - loss: 0.3820 - regression_loss: 0.3558 - classification_loss: 0.0262 326/500 [==================>...........] - ETA: 1:29 - loss: 0.3818 - regression_loss: 0.3556 - classification_loss: 0.0262 327/500 [==================>...........] - ETA: 1:28 - loss: 0.3819 - regression_loss: 0.3557 - classification_loss: 0.0261 328/500 [==================>...........] - ETA: 1:28 - loss: 0.3818 - regression_loss: 0.3557 - classification_loss: 0.0261 329/500 [==================>...........] - ETA: 1:27 - loss: 0.3828 - regression_loss: 0.3567 - classification_loss: 0.0261 330/500 [==================>...........] - ETA: 1:27 - loss: 0.3823 - regression_loss: 0.3563 - classification_loss: 0.0261 331/500 [==================>...........] - ETA: 1:26 - loss: 0.3835 - regression_loss: 0.3574 - classification_loss: 0.0261 332/500 [==================>...........] - ETA: 1:26 - loss: 0.3833 - regression_loss: 0.3572 - classification_loss: 0.0261 333/500 [==================>...........] 
- ETA: 1:25 - loss: 0.3831 - regression_loss: 0.3571 - classification_loss: 0.0261 334/500 [===================>..........] - ETA: 1:25 - loss: 0.3834 - regression_loss: 0.3573 - classification_loss: 0.0261 335/500 [===================>..........] - ETA: 1:24 - loss: 0.3831 - regression_loss: 0.3570 - classification_loss: 0.0261 336/500 [===================>..........] - ETA: 1:24 - loss: 0.3834 - regression_loss: 0.3573 - classification_loss: 0.0261 337/500 [===================>..........] - ETA: 1:23 - loss: 0.3830 - regression_loss: 0.3570 - classification_loss: 0.0260 338/500 [===================>..........] - ETA: 1:23 - loss: 0.3828 - regression_loss: 0.3568 - classification_loss: 0.0261 339/500 [===================>..........] - ETA: 1:22 - loss: 0.3830 - regression_loss: 0.3570 - classification_loss: 0.0260 340/500 [===================>..........] - ETA: 1:22 - loss: 0.3824 - regression_loss: 0.3564 - classification_loss: 0.0260 341/500 [===================>..........] - ETA: 1:21 - loss: 0.3822 - regression_loss: 0.3562 - classification_loss: 0.0260 342/500 [===================>..........] - ETA: 1:21 - loss: 0.3829 - regression_loss: 0.3568 - classification_loss: 0.0261 343/500 [===================>..........] - ETA: 1:20 - loss: 0.3851 - regression_loss: 0.3584 - classification_loss: 0.0267 344/500 [===================>..........] - ETA: 1:19 - loss: 0.3849 - regression_loss: 0.3583 - classification_loss: 0.0267 345/500 [===================>..........] - ETA: 1:19 - loss: 0.3842 - regression_loss: 0.3576 - classification_loss: 0.0266 346/500 [===================>..........] - ETA: 1:18 - loss: 0.3839 - regression_loss: 0.3573 - classification_loss: 0.0266 347/500 [===================>..........] - ETA: 1:18 - loss: 0.3835 - regression_loss: 0.3570 - classification_loss: 0.0265 348/500 [===================>..........] - ETA: 1:17 - loss: 0.3832 - regression_loss: 0.3568 - classification_loss: 0.0265 349/500 [===================>..........] 
(per-step progress for epoch 4, steps 350-499, omitted; loss held near 0.38 throughout)
500/500 [==============================] - 256s 513ms/step - loss: 0.3834 - regression_loss: 0.3570 - classification_loss: 0.0264
326 instances of class plum with average precision: 0.8515
mAP: 0.8515
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 00004: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
Epoch 5/150
(per-step progress for epoch 5, steps 1-182, omitted; loss began at 0.2117, peaked near 0.41 around step 23, and settled near 0.29)
183/500 [=========>....................] 
- ETA: 2:41 - loss: 0.2916 - regression_loss: 0.2712 - classification_loss: 0.0203 184/500 [==========>...................] - ETA: 2:41 - loss: 0.2908 - regression_loss: 0.2706 - classification_loss: 0.0203 185/500 [==========>...................] - ETA: 2:40 - loss: 0.2917 - regression_loss: 0.2713 - classification_loss: 0.0204 186/500 [==========>...................] - ETA: 2:40 - loss: 0.2913 - regression_loss: 0.2710 - classification_loss: 0.0204 187/500 [==========>...................] - ETA: 2:39 - loss: 0.2907 - regression_loss: 0.2704 - classification_loss: 0.0203 188/500 [==========>...................] - ETA: 2:39 - loss: 0.2901 - regression_loss: 0.2699 - classification_loss: 0.0203 189/500 [==========>...................] - ETA: 2:38 - loss: 0.2899 - regression_loss: 0.2697 - classification_loss: 0.0203 190/500 [==========>...................] - ETA: 2:38 - loss: 0.2897 - regression_loss: 0.2695 - classification_loss: 0.0202 191/500 [==========>...................] - ETA: 2:37 - loss: 0.2890 - regression_loss: 0.2688 - classification_loss: 0.0201 192/500 [==========>...................] - ETA: 2:37 - loss: 0.2885 - regression_loss: 0.2684 - classification_loss: 0.0201 193/500 [==========>...................] - ETA: 2:36 - loss: 0.2877 - regression_loss: 0.2677 - classification_loss: 0.0200 194/500 [==========>...................] - ETA: 2:36 - loss: 0.2872 - regression_loss: 0.2672 - classification_loss: 0.0200 195/500 [==========>...................] - ETA: 2:35 - loss: 0.2866 - regression_loss: 0.2667 - classification_loss: 0.0199 196/500 [==========>...................] - ETA: 2:35 - loss: 0.2856 - regression_loss: 0.2657 - classification_loss: 0.0199 197/500 [==========>...................] - ETA: 2:34 - loss: 0.2849 - regression_loss: 0.2651 - classification_loss: 0.0198 198/500 [==========>...................] - ETA: 2:34 - loss: 0.2848 - regression_loss: 0.2650 - classification_loss: 0.0198 199/500 [==========>...................] 
- ETA: 2:33 - loss: 0.2845 - regression_loss: 0.2647 - classification_loss: 0.0199 200/500 [===========>..................] - ETA: 2:33 - loss: 0.2838 - regression_loss: 0.2639 - classification_loss: 0.0198 201/500 [===========>..................] - ETA: 2:32 - loss: 0.2854 - regression_loss: 0.2656 - classification_loss: 0.0199 202/500 [===========>..................] - ETA: 2:32 - loss: 0.2855 - regression_loss: 0.2656 - classification_loss: 0.0200 203/500 [===========>..................] - ETA: 2:31 - loss: 0.2853 - regression_loss: 0.2653 - classification_loss: 0.0199 204/500 [===========>..................] - ETA: 2:31 - loss: 0.2849 - regression_loss: 0.2650 - classification_loss: 0.0199 205/500 [===========>..................] - ETA: 2:30 - loss: 0.2844 - regression_loss: 0.2645 - classification_loss: 0.0199 206/500 [===========>..................] - ETA: 2:30 - loss: 0.2845 - regression_loss: 0.2645 - classification_loss: 0.0199 207/500 [===========>..................] - ETA: 2:29 - loss: 0.2843 - regression_loss: 0.2644 - classification_loss: 0.0199 208/500 [===========>..................] - ETA: 2:29 - loss: 0.2841 - regression_loss: 0.2643 - classification_loss: 0.0199 209/500 [===========>..................] - ETA: 2:28 - loss: 0.2838 - regression_loss: 0.2640 - classification_loss: 0.0199 210/500 [===========>..................] - ETA: 2:28 - loss: 0.2865 - regression_loss: 0.2663 - classification_loss: 0.0202 211/500 [===========>..................] - ETA: 2:27 - loss: 0.2867 - regression_loss: 0.2666 - classification_loss: 0.0201 212/500 [===========>..................] - ETA: 2:27 - loss: 0.2872 - regression_loss: 0.2670 - classification_loss: 0.0202 213/500 [===========>..................] - ETA: 2:26 - loss: 0.2871 - regression_loss: 0.2669 - classification_loss: 0.0201 214/500 [===========>..................] - ETA: 2:26 - loss: 0.2878 - regression_loss: 0.2676 - classification_loss: 0.0202 215/500 [===========>..................] 
- ETA: 2:25 - loss: 0.2879 - regression_loss: 0.2676 - classification_loss: 0.0202 216/500 [===========>..................] - ETA: 2:25 - loss: 0.2883 - regression_loss: 0.2680 - classification_loss: 0.0203 217/500 [============>.................] - ETA: 2:24 - loss: 0.2882 - regression_loss: 0.2680 - classification_loss: 0.0202 218/500 [============>.................] - ETA: 2:24 - loss: 0.2874 - regression_loss: 0.2672 - classification_loss: 0.0202 219/500 [============>.................] - ETA: 2:23 - loss: 0.2870 - regression_loss: 0.2669 - classification_loss: 0.0201 220/500 [============>.................] - ETA: 2:23 - loss: 0.2869 - regression_loss: 0.2668 - classification_loss: 0.0201 221/500 [============>.................] - ETA: 2:22 - loss: 0.2868 - regression_loss: 0.2667 - classification_loss: 0.0201 222/500 [============>.................] - ETA: 2:22 - loss: 0.2893 - regression_loss: 0.2687 - classification_loss: 0.0206 223/500 [============>.................] - ETA: 2:21 - loss: 0.2889 - regression_loss: 0.2683 - classification_loss: 0.0206 224/500 [============>.................] - ETA: 2:21 - loss: 0.2896 - regression_loss: 0.2690 - classification_loss: 0.0206 225/500 [============>.................] - ETA: 2:20 - loss: 0.2891 - regression_loss: 0.2685 - classification_loss: 0.0205 226/500 [============>.................] - ETA: 2:20 - loss: 0.2886 - regression_loss: 0.2681 - classification_loss: 0.0205 227/500 [============>.................] - ETA: 2:19 - loss: 0.2883 - regression_loss: 0.2678 - classification_loss: 0.0205 228/500 [============>.................] - ETA: 2:19 - loss: 0.2876 - regression_loss: 0.2672 - classification_loss: 0.0204 229/500 [============>.................] - ETA: 2:18 - loss: 0.2878 - regression_loss: 0.2674 - classification_loss: 0.0204 230/500 [============>.................] - ETA: 2:18 - loss: 0.2869 - regression_loss: 0.2666 - classification_loss: 0.0203 231/500 [============>.................] 
- ETA: 2:17 - loss: 0.2862 - regression_loss: 0.2659 - classification_loss: 0.0203 232/500 [============>.................] - ETA: 2:17 - loss: 0.2858 - regression_loss: 0.2656 - classification_loss: 0.0202 233/500 [============>.................] - ETA: 2:16 - loss: 0.2863 - regression_loss: 0.2660 - classification_loss: 0.0202 234/500 [=============>................] - ETA: 2:16 - loss: 0.2859 - regression_loss: 0.2657 - classification_loss: 0.0202 235/500 [=============>................] - ETA: 2:15 - loss: 0.2853 - regression_loss: 0.2651 - classification_loss: 0.0202 236/500 [=============>................] - ETA: 2:15 - loss: 0.2846 - regression_loss: 0.2644 - classification_loss: 0.0201 237/500 [=============>................] - ETA: 2:14 - loss: 0.2849 - regression_loss: 0.2648 - classification_loss: 0.0201 238/500 [=============>................] - ETA: 2:14 - loss: 0.2844 - regression_loss: 0.2643 - classification_loss: 0.0201 239/500 [=============>................] - ETA: 2:13 - loss: 0.2842 - regression_loss: 0.2641 - classification_loss: 0.0201 240/500 [=============>................] - ETA: 2:12 - loss: 0.2836 - regression_loss: 0.2636 - classification_loss: 0.0200 241/500 [=============>................] - ETA: 2:12 - loss: 0.2833 - regression_loss: 0.2633 - classification_loss: 0.0200 242/500 [=============>................] - ETA: 2:11 - loss: 0.2828 - regression_loss: 0.2629 - classification_loss: 0.0199 243/500 [=============>................] - ETA: 2:11 - loss: 0.2819 - regression_loss: 0.2620 - classification_loss: 0.0199 244/500 [=============>................] - ETA: 2:10 - loss: 0.2816 - regression_loss: 0.2618 - classification_loss: 0.0199 245/500 [=============>................] - ETA: 2:10 - loss: 0.2819 - regression_loss: 0.2620 - classification_loss: 0.0199 246/500 [=============>................] - ETA: 2:09 - loss: 0.2821 - regression_loss: 0.2622 - classification_loss: 0.0199 247/500 [=============>................] 
- ETA: 2:09 - loss: 0.2815 - regression_loss: 0.2616 - classification_loss: 0.0198 248/500 [=============>................] - ETA: 2:08 - loss: 0.2808 - regression_loss: 0.2610 - classification_loss: 0.0198 249/500 [=============>................] - ETA: 2:08 - loss: 0.2805 - regression_loss: 0.2608 - classification_loss: 0.0198 250/500 [==============>...............] - ETA: 2:07 - loss: 0.2803 - regression_loss: 0.2605 - classification_loss: 0.0198 251/500 [==============>...............] - ETA: 2:07 - loss: 0.2798 - regression_loss: 0.2601 - classification_loss: 0.0197 252/500 [==============>...............] - ETA: 2:06 - loss: 0.2794 - regression_loss: 0.2597 - classification_loss: 0.0197 253/500 [==============>...............] - ETA: 2:06 - loss: 0.2791 - regression_loss: 0.2593 - classification_loss: 0.0197 254/500 [==============>...............] - ETA: 2:05 - loss: 0.2790 - regression_loss: 0.2593 - classification_loss: 0.0197 255/500 [==============>...............] - ETA: 2:05 - loss: 0.2799 - regression_loss: 0.2602 - classification_loss: 0.0197 256/500 [==============>...............] - ETA: 2:04 - loss: 0.2795 - regression_loss: 0.2599 - classification_loss: 0.0197 257/500 [==============>...............] - ETA: 2:04 - loss: 0.2794 - regression_loss: 0.2597 - classification_loss: 0.0197 258/500 [==============>...............] - ETA: 2:03 - loss: 0.2790 - regression_loss: 0.2594 - classification_loss: 0.0197 259/500 [==============>...............] - ETA: 2:03 - loss: 0.2792 - regression_loss: 0.2595 - classification_loss: 0.0197 260/500 [==============>...............] - ETA: 2:02 - loss: 0.2790 - regression_loss: 0.2594 - classification_loss: 0.0196 261/500 [==============>...............] - ETA: 2:02 - loss: 0.2795 - regression_loss: 0.2598 - classification_loss: 0.0197 262/500 [==============>...............] - ETA: 2:01 - loss: 0.2790 - regression_loss: 0.2594 - classification_loss: 0.0197 263/500 [==============>...............] 
- ETA: 2:01 - loss: 0.2785 - regression_loss: 0.2589 - classification_loss: 0.0196 264/500 [==============>...............] - ETA: 2:00 - loss: 0.2784 - regression_loss: 0.2588 - classification_loss: 0.0196 265/500 [==============>...............] - ETA: 2:00 - loss: 0.2783 - regression_loss: 0.2587 - classification_loss: 0.0196 266/500 [==============>...............] - ETA: 1:59 - loss: 0.2780 - regression_loss: 0.2585 - classification_loss: 0.0196 267/500 [===============>..............] - ETA: 1:59 - loss: 0.2774 - regression_loss: 0.2579 - classification_loss: 0.0195 268/500 [===============>..............] - ETA: 1:58 - loss: 0.2777 - regression_loss: 0.2582 - classification_loss: 0.0195 269/500 [===============>..............] - ETA: 1:58 - loss: 0.2775 - regression_loss: 0.2580 - classification_loss: 0.0195 270/500 [===============>..............] - ETA: 1:57 - loss: 0.2793 - regression_loss: 0.2596 - classification_loss: 0.0197 271/500 [===============>..............] - ETA: 1:57 - loss: 0.2793 - regression_loss: 0.2596 - classification_loss: 0.0197 272/500 [===============>..............] - ETA: 1:56 - loss: 0.2806 - regression_loss: 0.2608 - classification_loss: 0.0198 273/500 [===============>..............] - ETA: 1:56 - loss: 0.2801 - regression_loss: 0.2604 - classification_loss: 0.0197 274/500 [===============>..............] - ETA: 1:55 - loss: 0.2804 - regression_loss: 0.2606 - classification_loss: 0.0197 275/500 [===============>..............] - ETA: 1:55 - loss: 0.2803 - regression_loss: 0.2605 - classification_loss: 0.0197 276/500 [===============>..............] - ETA: 1:54 - loss: 0.2805 - regression_loss: 0.2608 - classification_loss: 0.0197 277/500 [===============>..............] - ETA: 1:54 - loss: 0.2802 - regression_loss: 0.2605 - classification_loss: 0.0197 278/500 [===============>..............] - ETA: 1:53 - loss: 0.2801 - regression_loss: 0.2604 - classification_loss: 0.0197 279/500 [===============>..............] 
- ETA: 1:53 - loss: 0.2805 - regression_loss: 0.2608 - classification_loss: 0.0197 280/500 [===============>..............] - ETA: 1:52 - loss: 0.2801 - regression_loss: 0.2604 - classification_loss: 0.0197 281/500 [===============>..............] - ETA: 1:52 - loss: 0.2795 - regression_loss: 0.2599 - classification_loss: 0.0196 282/500 [===============>..............] - ETA: 1:51 - loss: 0.2789 - regression_loss: 0.2593 - classification_loss: 0.0196 283/500 [===============>..............] - ETA: 1:51 - loss: 0.2787 - regression_loss: 0.2591 - classification_loss: 0.0196 284/500 [================>.............] - ETA: 1:50 - loss: 0.2782 - regression_loss: 0.2586 - classification_loss: 0.0195 285/500 [================>.............] - ETA: 1:50 - loss: 0.2777 - regression_loss: 0.2583 - classification_loss: 0.0195 286/500 [================>.............] - ETA: 1:49 - loss: 0.2774 - regression_loss: 0.2579 - classification_loss: 0.0195 287/500 [================>.............] - ETA: 1:49 - loss: 0.2784 - regression_loss: 0.2587 - classification_loss: 0.0197 288/500 [================>.............] - ETA: 1:48 - loss: 0.2778 - regression_loss: 0.2581 - classification_loss: 0.0197 289/500 [================>.............] - ETA: 1:48 - loss: 0.2775 - regression_loss: 0.2578 - classification_loss: 0.0197 290/500 [================>.............] - ETA: 1:47 - loss: 0.2774 - regression_loss: 0.2576 - classification_loss: 0.0197 291/500 [================>.............] - ETA: 1:47 - loss: 0.2768 - regression_loss: 0.2571 - classification_loss: 0.0197 292/500 [================>.............] - ETA: 1:46 - loss: 0.2776 - regression_loss: 0.2579 - classification_loss: 0.0197 293/500 [================>.............] - ETA: 1:45 - loss: 0.2776 - regression_loss: 0.2579 - classification_loss: 0.0197 294/500 [================>.............] - ETA: 1:45 - loss: 0.2777 - regression_loss: 0.2580 - classification_loss: 0.0197 295/500 [================>.............] 
- ETA: 1:44 - loss: 0.2776 - regression_loss: 0.2580 - classification_loss: 0.0196 296/500 [================>.............] - ETA: 1:44 - loss: 0.2773 - regression_loss: 0.2577 - classification_loss: 0.0196 297/500 [================>.............] - ETA: 1:43 - loss: 0.2768 - regression_loss: 0.2572 - classification_loss: 0.0196 298/500 [================>.............] - ETA: 1:43 - loss: 0.2766 - regression_loss: 0.2571 - classification_loss: 0.0196 299/500 [================>.............] - ETA: 1:42 - loss: 0.2762 - regression_loss: 0.2567 - classification_loss: 0.0195 300/500 [=================>............] - ETA: 1:42 - loss: 0.2759 - regression_loss: 0.2563 - classification_loss: 0.0195 301/500 [=================>............] - ETA: 1:41 - loss: 0.2756 - regression_loss: 0.2561 - classification_loss: 0.0195 302/500 [=================>............] - ETA: 1:41 - loss: 0.2754 - regression_loss: 0.2559 - classification_loss: 0.0195 303/500 [=================>............] - ETA: 1:40 - loss: 0.2750 - regression_loss: 0.2556 - classification_loss: 0.0195 304/500 [=================>............] - ETA: 1:40 - loss: 0.2750 - regression_loss: 0.2556 - classification_loss: 0.0194 305/500 [=================>............] - ETA: 1:39 - loss: 0.2749 - regression_loss: 0.2555 - classification_loss: 0.0194 306/500 [=================>............] - ETA: 1:39 - loss: 0.2744 - regression_loss: 0.2550 - classification_loss: 0.0194 307/500 [=================>............] - ETA: 1:38 - loss: 0.2737 - regression_loss: 0.2543 - classification_loss: 0.0193 308/500 [=================>............] - ETA: 1:38 - loss: 0.2735 - regression_loss: 0.2542 - classification_loss: 0.0193 309/500 [=================>............] - ETA: 1:37 - loss: 0.2748 - regression_loss: 0.2551 - classification_loss: 0.0197 310/500 [=================>............] - ETA: 1:37 - loss: 0.2748 - regression_loss: 0.2551 - classification_loss: 0.0196 311/500 [=================>............] 
- ETA: 1:36 - loss: 0.2743 - regression_loss: 0.2547 - classification_loss: 0.0196 312/500 [=================>............] - ETA: 1:36 - loss: 0.2764 - regression_loss: 0.2561 - classification_loss: 0.0203 313/500 [=================>............] - ETA: 1:35 - loss: 0.2779 - regression_loss: 0.2573 - classification_loss: 0.0206 314/500 [=================>............] - ETA: 1:35 - loss: 0.2774 - regression_loss: 0.2569 - classification_loss: 0.0205 315/500 [=================>............] - ETA: 1:34 - loss: 0.2770 - regression_loss: 0.2565 - classification_loss: 0.0205 316/500 [=================>............] - ETA: 1:34 - loss: 0.2768 - regression_loss: 0.2563 - classification_loss: 0.0205 317/500 [==================>...........] - ETA: 1:33 - loss: 0.2762 - regression_loss: 0.2557 - classification_loss: 0.0204 318/500 [==================>...........] - ETA: 1:33 - loss: 0.2760 - regression_loss: 0.2556 - classification_loss: 0.0204 319/500 [==================>...........] - ETA: 1:32 - loss: 0.2758 - regression_loss: 0.2554 - classification_loss: 0.0204 320/500 [==================>...........] - ETA: 1:32 - loss: 0.2752 - regression_loss: 0.2548 - classification_loss: 0.0204 321/500 [==================>...........] - ETA: 1:31 - loss: 0.2751 - regression_loss: 0.2547 - classification_loss: 0.0204 322/500 [==================>...........] - ETA: 1:31 - loss: 0.2751 - regression_loss: 0.2547 - classification_loss: 0.0203 323/500 [==================>...........] - ETA: 1:30 - loss: 0.2746 - regression_loss: 0.2543 - classification_loss: 0.0203 324/500 [==================>...........] - ETA: 1:30 - loss: 0.2741 - regression_loss: 0.2538 - classification_loss: 0.0203 325/500 [==================>...........] - ETA: 1:29 - loss: 0.2738 - regression_loss: 0.2535 - classification_loss: 0.0202 326/500 [==================>...........] - ETA: 1:29 - loss: 0.2731 - regression_loss: 0.2530 - classification_loss: 0.0202 327/500 [==================>...........] 
- ETA: 1:28 - loss: 0.2729 - regression_loss: 0.2527 - classification_loss: 0.0202 328/500 [==================>...........] - ETA: 1:28 - loss: 0.2726 - regression_loss: 0.2524 - classification_loss: 0.0201 329/500 [==================>...........] - ETA: 1:27 - loss: 0.2730 - regression_loss: 0.2529 - classification_loss: 0.0202 330/500 [==================>...........] - ETA: 1:27 - loss: 0.2728 - regression_loss: 0.2526 - classification_loss: 0.0201 331/500 [==================>...........] - ETA: 1:26 - loss: 0.2727 - regression_loss: 0.2526 - classification_loss: 0.0201 332/500 [==================>...........] - ETA: 1:26 - loss: 0.2731 - regression_loss: 0.2530 - classification_loss: 0.0202 333/500 [==================>...........] - ETA: 1:25 - loss: 0.2744 - regression_loss: 0.2541 - classification_loss: 0.0203 334/500 [===================>..........] - ETA: 1:25 - loss: 0.2745 - regression_loss: 0.2542 - classification_loss: 0.0203 335/500 [===================>..........] - ETA: 1:24 - loss: 0.2742 - regression_loss: 0.2539 - classification_loss: 0.0203 336/500 [===================>..........] - ETA: 1:23 - loss: 0.2740 - regression_loss: 0.2538 - classification_loss: 0.0202 337/500 [===================>..........] - ETA: 1:23 - loss: 0.2737 - regression_loss: 0.2535 - classification_loss: 0.0202 338/500 [===================>..........] - ETA: 1:22 - loss: 0.2731 - regression_loss: 0.2529 - classification_loss: 0.0202 339/500 [===================>..........] - ETA: 1:22 - loss: 0.2727 - regression_loss: 0.2525 - classification_loss: 0.0202 340/500 [===================>..........] - ETA: 1:21 - loss: 0.2726 - regression_loss: 0.2525 - classification_loss: 0.0201 341/500 [===================>..........] - ETA: 1:21 - loss: 0.2722 - regression_loss: 0.2521 - classification_loss: 0.0201 342/500 [===================>..........] - ETA: 1:20 - loss: 0.2724 - regression_loss: 0.2523 - classification_loss: 0.0201 343/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2722 - regression_loss: 0.2521 - classification_loss: 0.0201 344/500 [===================>..........] - ETA: 1:19 - loss: 0.2718 - regression_loss: 0.2517 - classification_loss: 0.0201 345/500 [===================>..........] - ETA: 1:19 - loss: 0.2714 - regression_loss: 0.2514 - classification_loss: 0.0200 346/500 [===================>..........] - ETA: 1:18 - loss: 0.2710 - regression_loss: 0.2510 - classification_loss: 0.0200 347/500 [===================>..........] - ETA: 1:18 - loss: 0.2727 - regression_loss: 0.2525 - classification_loss: 0.0202 348/500 [===================>..........] - ETA: 1:17 - loss: 0.2721 - regression_loss: 0.2520 - classification_loss: 0.0202 349/500 [===================>..........] - ETA: 1:17 - loss: 0.2716 - regression_loss: 0.2515 - classification_loss: 0.0201 350/500 [====================>.........] - ETA: 1:16 - loss: 0.2718 - regression_loss: 0.2517 - classification_loss: 0.0201 351/500 [====================>.........] - ETA: 1:16 - loss: 0.2715 - regression_loss: 0.2514 - classification_loss: 0.0201 352/500 [====================>.........] - ETA: 1:15 - loss: 0.2711 - regression_loss: 0.2510 - classification_loss: 0.0201 353/500 [====================>.........] - ETA: 1:15 - loss: 0.2707 - regression_loss: 0.2506 - classification_loss: 0.0201 354/500 [====================>.........] - ETA: 1:14 - loss: 0.2701 - regression_loss: 0.2500 - classification_loss: 0.0200 355/500 [====================>.........] - ETA: 1:14 - loss: 0.2696 - regression_loss: 0.2496 - classification_loss: 0.0200 356/500 [====================>.........] - ETA: 1:13 - loss: 0.2693 - regression_loss: 0.2493 - classification_loss: 0.0200 357/500 [====================>.........] - ETA: 1:13 - loss: 0.2698 - regression_loss: 0.2497 - classification_loss: 0.0200 358/500 [====================>.........] - ETA: 1:12 - loss: 0.2695 - regression_loss: 0.2495 - classification_loss: 0.0200 359/500 [====================>.........] 
- ETA: 1:12 - loss: 0.2694 - regression_loss: 0.2494 - classification_loss: 0.0200 360/500 [====================>.........] - ETA: 1:11 - loss: 0.2702 - regression_loss: 0.2502 - classification_loss: 0.0200 361/500 [====================>.........] - ETA: 1:11 - loss: 0.2708 - regression_loss: 0.2507 - classification_loss: 0.0200 362/500 [====================>.........] - ETA: 1:10 - loss: 0.2713 - regression_loss: 0.2513 - classification_loss: 0.0200 363/500 [====================>.........] - ETA: 1:10 - loss: 0.2713 - regression_loss: 0.2512 - classification_loss: 0.0201 364/500 [====================>.........] - ETA: 1:09 - loss: 0.2719 - regression_loss: 0.2518 - classification_loss: 0.0201 365/500 [====================>.........] - ETA: 1:09 - loss: 0.2714 - regression_loss: 0.2514 - classification_loss: 0.0201 366/500 [====================>.........] - ETA: 1:08 - loss: 0.2715 - regression_loss: 0.2514 - classification_loss: 0.0201 367/500 [=====================>........] - ETA: 1:08 - loss: 0.2714 - regression_loss: 0.2513 - classification_loss: 0.0201 368/500 [=====================>........] - ETA: 1:07 - loss: 0.2712 - regression_loss: 0.2512 - classification_loss: 0.0201 369/500 [=====================>........] - ETA: 1:07 - loss: 0.2710 - regression_loss: 0.2509 - classification_loss: 0.0201 370/500 [=====================>........] - ETA: 1:06 - loss: 0.2714 - regression_loss: 0.2513 - classification_loss: 0.0200 371/500 [=====================>........] - ETA: 1:06 - loss: 0.2710 - regression_loss: 0.2510 - classification_loss: 0.0200 372/500 [=====================>........] - ETA: 1:05 - loss: 0.2708 - regression_loss: 0.2508 - classification_loss: 0.0200 373/500 [=====================>........] - ETA: 1:05 - loss: 0.2711 - regression_loss: 0.2511 - classification_loss: 0.0200 374/500 [=====================>........] - ETA: 1:04 - loss: 0.2708 - regression_loss: 0.2508 - classification_loss: 0.0200 375/500 [=====================>........] 
- ETA: 1:04 - loss: 0.2703 - regression_loss: 0.2503 - classification_loss: 0.0200 376/500 [=====================>........] - ETA: 1:03 - loss: 0.2701 - regression_loss: 0.2502 - classification_loss: 0.0199 377/500 [=====================>........] - ETA: 1:03 - loss: 0.2698 - regression_loss: 0.2499 - classification_loss: 0.0199 378/500 [=====================>........] - ETA: 1:02 - loss: 0.2695 - regression_loss: 0.2495 - classification_loss: 0.0199 379/500 [=====================>........] - ETA: 1:02 - loss: 0.2691 - regression_loss: 0.2492 - classification_loss: 0.0199 380/500 [=====================>........] - ETA: 1:01 - loss: 0.2689 - regression_loss: 0.2490 - classification_loss: 0.0199 381/500 [=====================>........] - ETA: 1:00 - loss: 0.2688 - regression_loss: 0.2490 - classification_loss: 0.0199 382/500 [=====================>........] - ETA: 1:00 - loss: 0.2685 - regression_loss: 0.2486 - classification_loss: 0.0198 383/500 [=====================>........] - ETA: 59s - loss: 0.2683 - regression_loss: 0.2486 - classification_loss: 0.0198  384/500 [======================>.......] - ETA: 59s - loss: 0.2686 - regression_loss: 0.2488 - classification_loss: 0.0198 385/500 [======================>.......] - ETA: 58s - loss: 0.2684 - regression_loss: 0.2486 - classification_loss: 0.0198 386/500 [======================>.......] - ETA: 58s - loss: 0.2679 - regression_loss: 0.2481 - classification_loss: 0.0198 387/500 [======================>.......] - ETA: 57s - loss: 0.2680 - regression_loss: 0.2482 - classification_loss: 0.0198 388/500 [======================>.......] - ETA: 57s - loss: 0.2678 - regression_loss: 0.2480 - classification_loss: 0.0198 389/500 [======================>.......] - ETA: 56s - loss: 0.2678 - regression_loss: 0.2480 - classification_loss: 0.0198 390/500 [======================>.......] - ETA: 56s - loss: 0.2677 - regression_loss: 0.2479 - classification_loss: 0.0197 391/500 [======================>.......] 
- ETA: 55s - loss: 0.2675 - regression_loss: 0.2478 - classification_loss: 0.0197
[... per-batch progress redraws for steps 392-499 of epoch 5 condensed; running loss drifts from 0.2675 down to 0.2533 ...]
500/500 [==============================] - 256s 513ms/step - loss: 0.2530 - regression_loss: 0.2343 - classification_loss: 0.0187
326 instances of class plum with average precision: 0.8460
mAP: 0.8460
Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5
Epoch 6/150
[... per-batch progress redraws for steps 1-225 of epoch 6 condensed; ETA ~4:18 at step 1, running loss settling near 0.199 (regression_loss ~0.181, classification_loss ~0.018) by step 225 ...]
226/500 [============>.................] 
- ETA: 2:20 - loss: 0.1989 - regression_loss: 0.1811 - classification_loss: 0.0178 227/500 [============>.................] - ETA: 2:19 - loss: 0.1990 - regression_loss: 0.1812 - classification_loss: 0.0177 228/500 [============>.................] - ETA: 2:19 - loss: 0.1987 - regression_loss: 0.1809 - classification_loss: 0.0177 229/500 [============>.................] - ETA: 2:18 - loss: 0.1983 - regression_loss: 0.1806 - classification_loss: 0.0177 230/500 [============>.................] - ETA: 2:18 - loss: 0.1978 - regression_loss: 0.1801 - classification_loss: 0.0177 231/500 [============>.................] - ETA: 2:17 - loss: 0.1988 - regression_loss: 0.1810 - classification_loss: 0.0177 232/500 [============>.................] - ETA: 2:17 - loss: 0.1985 - regression_loss: 0.1808 - classification_loss: 0.0177 233/500 [============>.................] - ETA: 2:16 - loss: 0.1994 - regression_loss: 0.1814 - classification_loss: 0.0180 234/500 [=============>................] - ETA: 2:15 - loss: 0.1991 - regression_loss: 0.1811 - classification_loss: 0.0179 235/500 [=============>................] - ETA: 2:15 - loss: 0.1990 - regression_loss: 0.1810 - classification_loss: 0.0179 236/500 [=============>................] - ETA: 2:14 - loss: 0.1986 - regression_loss: 0.1807 - classification_loss: 0.0179 237/500 [=============>................] - ETA: 2:14 - loss: 0.1992 - regression_loss: 0.1813 - classification_loss: 0.0179 238/500 [=============>................] - ETA: 2:13 - loss: 0.1991 - regression_loss: 0.1813 - classification_loss: 0.0179 239/500 [=============>................] - ETA: 2:13 - loss: 0.1989 - regression_loss: 0.1811 - classification_loss: 0.0178 240/500 [=============>................] - ETA: 2:12 - loss: 0.1988 - regression_loss: 0.1810 - classification_loss: 0.0179 241/500 [=============>................] - ETA: 2:12 - loss: 0.1987 - regression_loss: 0.1809 - classification_loss: 0.0178 242/500 [=============>................] 
- ETA: 2:11 - loss: 0.1987 - regression_loss: 0.1809 - classification_loss: 0.0178 243/500 [=============>................] - ETA: 2:11 - loss: 0.1986 - regression_loss: 0.1808 - classification_loss: 0.0178 244/500 [=============>................] - ETA: 2:10 - loss: 0.1986 - regression_loss: 0.1808 - classification_loss: 0.0178 245/500 [=============>................] - ETA: 2:10 - loss: 0.1983 - regression_loss: 0.1805 - classification_loss: 0.0178 246/500 [=============>................] - ETA: 2:09 - loss: 0.1986 - regression_loss: 0.1809 - classification_loss: 0.0178 247/500 [=============>................] - ETA: 2:09 - loss: 0.1981 - regression_loss: 0.1804 - classification_loss: 0.0177 248/500 [=============>................] - ETA: 2:08 - loss: 0.1977 - regression_loss: 0.1800 - classification_loss: 0.0177 249/500 [=============>................] - ETA: 2:08 - loss: 0.1975 - regression_loss: 0.1798 - classification_loss: 0.0177 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1977 - regression_loss: 0.1800 - classification_loss: 0.0176 251/500 [==============>...............] - ETA: 2:07 - loss: 0.1973 - regression_loss: 0.1797 - classification_loss: 0.0176 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1975 - regression_loss: 0.1799 - classification_loss: 0.0176 253/500 [==============>...............] - ETA: 2:06 - loss: 0.1975 - regression_loss: 0.1799 - classification_loss: 0.0176 254/500 [==============>...............] - ETA: 2:05 - loss: 0.1977 - regression_loss: 0.1801 - classification_loss: 0.0176 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1975 - regression_loss: 0.1799 - classification_loss: 0.0176 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1982 - regression_loss: 0.1806 - classification_loss: 0.0176 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1984 - regression_loss: 0.1808 - classification_loss: 0.0176 258/500 [==============>...............] 
- ETA: 2:03 - loss: 0.2001 - regression_loss: 0.1823 - classification_loss: 0.0178 259/500 [==============>...............] - ETA: 2:03 - loss: 0.2001 - regression_loss: 0.1823 - classification_loss: 0.0178 260/500 [==============>...............] - ETA: 2:02 - loss: 0.2002 - regression_loss: 0.1824 - classification_loss: 0.0178 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1998 - regression_loss: 0.1821 - classification_loss: 0.0177 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1995 - regression_loss: 0.1818 - classification_loss: 0.0177 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1999 - regression_loss: 0.1821 - classification_loss: 0.0178 264/500 [==============>...............] - ETA: 2:00 - loss: 0.2000 - regression_loss: 0.1822 - classification_loss: 0.0178 265/500 [==============>...............] - ETA: 2:00 - loss: 0.2002 - regression_loss: 0.1824 - classification_loss: 0.0178 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1996 - regression_loss: 0.1819 - classification_loss: 0.0178 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1994 - regression_loss: 0.1816 - classification_loss: 0.0178 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1993 - regression_loss: 0.1816 - classification_loss: 0.0177 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1990 - regression_loss: 0.1814 - classification_loss: 0.0177 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1989 - regression_loss: 0.1812 - classification_loss: 0.0177 271/500 [===============>..............] - ETA: 1:57 - loss: 0.1996 - regression_loss: 0.1820 - classification_loss: 0.0177 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1997 - regression_loss: 0.1820 - classification_loss: 0.0177 273/500 [===============>..............] - ETA: 1:56 - loss: 0.2022 - regression_loss: 0.1838 - classification_loss: 0.0185 274/500 [===============>..............] 
- ETA: 1:55 - loss: 0.2020 - regression_loss: 0.1835 - classification_loss: 0.0185 275/500 [===============>..............] - ETA: 1:55 - loss: 0.2018 - regression_loss: 0.1834 - classification_loss: 0.0185 276/500 [===============>..............] - ETA: 1:54 - loss: 0.2019 - regression_loss: 0.1835 - classification_loss: 0.0185 277/500 [===============>..............] - ETA: 1:54 - loss: 0.2016 - regression_loss: 0.1831 - classification_loss: 0.0185 278/500 [===============>..............] - ETA: 1:53 - loss: 0.2014 - regression_loss: 0.1830 - classification_loss: 0.0184 279/500 [===============>..............] - ETA: 1:53 - loss: 0.2011 - regression_loss: 0.1827 - classification_loss: 0.0184 280/500 [===============>..............] - ETA: 1:52 - loss: 0.2011 - regression_loss: 0.1827 - classification_loss: 0.0184 281/500 [===============>..............] - ETA: 1:52 - loss: 0.2024 - regression_loss: 0.1836 - classification_loss: 0.0187 282/500 [===============>..............] - ETA: 1:51 - loss: 0.2020 - regression_loss: 0.1833 - classification_loss: 0.0187 283/500 [===============>..............] - ETA: 1:51 - loss: 0.2016 - regression_loss: 0.1830 - classification_loss: 0.0186 284/500 [================>.............] - ETA: 1:50 - loss: 0.2016 - regression_loss: 0.1830 - classification_loss: 0.0186 285/500 [================>.............] - ETA: 1:49 - loss: 0.2014 - regression_loss: 0.1828 - classification_loss: 0.0186 286/500 [================>.............] - ETA: 1:49 - loss: 0.2010 - regression_loss: 0.1824 - classification_loss: 0.0186 287/500 [================>.............] - ETA: 1:48 - loss: 0.2007 - regression_loss: 0.1821 - classification_loss: 0.0185 288/500 [================>.............] - ETA: 1:48 - loss: 0.2003 - regression_loss: 0.1818 - classification_loss: 0.0185 289/500 [================>.............] - ETA: 1:47 - loss: 0.2002 - regression_loss: 0.1817 - classification_loss: 0.0185 290/500 [================>.............] 
- ETA: 1:47 - loss: 0.2023 - regression_loss: 0.1836 - classification_loss: 0.0187 291/500 [================>.............] - ETA: 1:46 - loss: 0.2024 - regression_loss: 0.1836 - classification_loss: 0.0187 292/500 [================>.............] - ETA: 1:46 - loss: 0.2020 - regression_loss: 0.1833 - classification_loss: 0.0187 293/500 [================>.............] - ETA: 1:45 - loss: 0.2017 - regression_loss: 0.1831 - classification_loss: 0.0186 294/500 [================>.............] - ETA: 1:45 - loss: 0.2013 - regression_loss: 0.1827 - classification_loss: 0.0186 295/500 [================>.............] - ETA: 1:44 - loss: 0.2009 - regression_loss: 0.1823 - classification_loss: 0.0186 296/500 [================>.............] - ETA: 1:44 - loss: 0.2009 - regression_loss: 0.1823 - classification_loss: 0.0186 297/500 [================>.............] - ETA: 1:43 - loss: 0.2007 - regression_loss: 0.1821 - classification_loss: 0.0185 298/500 [================>.............] - ETA: 1:43 - loss: 0.2004 - regression_loss: 0.1819 - classification_loss: 0.0185 299/500 [================>.............] - ETA: 1:42 - loss: 0.2001 - regression_loss: 0.1816 - classification_loss: 0.0185 300/500 [=================>............] - ETA: 1:42 - loss: 0.1997 - regression_loss: 0.1813 - classification_loss: 0.0184 301/500 [=================>............] - ETA: 1:41 - loss: 0.1995 - regression_loss: 0.1811 - classification_loss: 0.0184 302/500 [=================>............] - ETA: 1:41 - loss: 0.1991 - regression_loss: 0.1808 - classification_loss: 0.0184 303/500 [=================>............] - ETA: 1:40 - loss: 0.1989 - regression_loss: 0.1806 - classification_loss: 0.0183 304/500 [=================>............] - ETA: 1:40 - loss: 0.1988 - regression_loss: 0.1805 - classification_loss: 0.0183 305/500 [=================>............] - ETA: 1:39 - loss: 0.1993 - regression_loss: 0.1809 - classification_loss: 0.0183 306/500 [=================>............] 
- ETA: 1:39 - loss: 0.1988 - regression_loss: 0.1805 - classification_loss: 0.0183 307/500 [=================>............] - ETA: 1:38 - loss: 0.1991 - regression_loss: 0.1809 - classification_loss: 0.0183 308/500 [=================>............] - ETA: 1:38 - loss: 0.1988 - regression_loss: 0.1805 - classification_loss: 0.0183 309/500 [=================>............] - ETA: 1:37 - loss: 0.1993 - regression_loss: 0.1810 - classification_loss: 0.0183 310/500 [=================>............] - ETA: 1:37 - loss: 0.1990 - regression_loss: 0.1808 - classification_loss: 0.0183 311/500 [=================>............] - ETA: 1:36 - loss: 0.1989 - regression_loss: 0.1807 - classification_loss: 0.0182 312/500 [=================>............] - ETA: 1:36 - loss: 0.1990 - regression_loss: 0.1808 - classification_loss: 0.0182 313/500 [=================>............] - ETA: 1:35 - loss: 0.1989 - regression_loss: 0.1807 - classification_loss: 0.0182 314/500 [=================>............] - ETA: 1:35 - loss: 0.1988 - regression_loss: 0.1806 - classification_loss: 0.0182 315/500 [=================>............] - ETA: 1:34 - loss: 0.1986 - regression_loss: 0.1804 - classification_loss: 0.0182 316/500 [=================>............] - ETA: 1:34 - loss: 0.1987 - regression_loss: 0.1805 - classification_loss: 0.0182 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1985 - regression_loss: 0.1803 - classification_loss: 0.0182 318/500 [==================>...........] - ETA: 1:33 - loss: 0.1985 - regression_loss: 0.1804 - classification_loss: 0.0182 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1987 - regression_loss: 0.1806 - classification_loss: 0.0181 320/500 [==================>...........] - ETA: 1:32 - loss: 0.1987 - regression_loss: 0.1806 - classification_loss: 0.0181 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1987 - regression_loss: 0.1806 - classification_loss: 0.0181 322/500 [==================>...........] 
- ETA: 1:31 - loss: 0.1986 - regression_loss: 0.1805 - classification_loss: 0.0181 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1996 - regression_loss: 0.1815 - classification_loss: 0.0181 324/500 [==================>...........] - ETA: 1:30 - loss: 0.1992 - regression_loss: 0.1811 - classification_loss: 0.0181 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1989 - regression_loss: 0.1809 - classification_loss: 0.0180 326/500 [==================>...........] - ETA: 1:29 - loss: 0.1986 - regression_loss: 0.1806 - classification_loss: 0.0180 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1983 - regression_loss: 0.1803 - classification_loss: 0.0180 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1982 - regression_loss: 0.1802 - classification_loss: 0.0180 329/500 [==================>...........] - ETA: 1:27 - loss: 0.1978 - regression_loss: 0.1799 - classification_loss: 0.0179 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1979 - regression_loss: 0.1800 - classification_loss: 0.0179 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1992 - regression_loss: 0.1811 - classification_loss: 0.0181 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1994 - regression_loss: 0.1814 - classification_loss: 0.0181 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1991 - regression_loss: 0.1810 - classification_loss: 0.0180 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1998 - regression_loss: 0.1817 - classification_loss: 0.0181 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1999 - regression_loss: 0.1818 - classification_loss: 0.0181 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1995 - regression_loss: 0.1815 - classification_loss: 0.0180 337/500 [===================>..........] - ETA: 1:23 - loss: 0.1994 - regression_loss: 0.1814 - classification_loss: 0.0180 338/500 [===================>..........] 
- ETA: 1:22 - loss: 0.2007 - regression_loss: 0.1824 - classification_loss: 0.0183 339/500 [===================>..........] - ETA: 1:22 - loss: 0.2004 - regression_loss: 0.1821 - classification_loss: 0.0182 340/500 [===================>..........] - ETA: 1:21 - loss: 0.2008 - regression_loss: 0.1825 - classification_loss: 0.0183 341/500 [===================>..........] - ETA: 1:21 - loss: 0.2012 - regression_loss: 0.1829 - classification_loss: 0.0183 342/500 [===================>..........] - ETA: 1:20 - loss: 0.2009 - regression_loss: 0.1826 - classification_loss: 0.0183 343/500 [===================>..........] - ETA: 1:20 - loss: 0.2007 - regression_loss: 0.1824 - classification_loss: 0.0183 344/500 [===================>..........] - ETA: 1:19 - loss: 0.2004 - regression_loss: 0.1822 - classification_loss: 0.0182 345/500 [===================>..........] - ETA: 1:19 - loss: 0.2001 - regression_loss: 0.1818 - classification_loss: 0.0182 346/500 [===================>..........] - ETA: 1:18 - loss: 0.2000 - regression_loss: 0.1818 - classification_loss: 0.0182 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1997 - regression_loss: 0.1816 - classification_loss: 0.0182 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1994 - regression_loss: 0.1813 - classification_loss: 0.0181 349/500 [===================>..........] - ETA: 1:17 - loss: 0.1992 - regression_loss: 0.1811 - classification_loss: 0.0181 350/500 [====================>.........] - ETA: 1:16 - loss: 0.1990 - regression_loss: 0.1809 - classification_loss: 0.0181 351/500 [====================>.........] - ETA: 1:16 - loss: 0.1990 - regression_loss: 0.1809 - classification_loss: 0.0181 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1986 - regression_loss: 0.1806 - classification_loss: 0.0180 353/500 [====================>.........] - ETA: 1:15 - loss: 0.1985 - regression_loss: 0.1805 - classification_loss: 0.0180 354/500 [====================>.........] 
- ETA: 1:14 - loss: 0.1982 - regression_loss: 0.1802 - classification_loss: 0.0180 355/500 [====================>.........] - ETA: 1:14 - loss: 0.1980 - regression_loss: 0.1800 - classification_loss: 0.0180 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1978 - regression_loss: 0.1798 - classification_loss: 0.0179 357/500 [====================>.........] - ETA: 1:13 - loss: 0.1973 - regression_loss: 0.1794 - classification_loss: 0.0179 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1975 - regression_loss: 0.1796 - classification_loss: 0.0179 359/500 [====================>.........] - ETA: 1:12 - loss: 0.1973 - regression_loss: 0.1794 - classification_loss: 0.0179 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1969 - regression_loss: 0.1790 - classification_loss: 0.0179 361/500 [====================>.........] - ETA: 1:11 - loss: 0.1966 - regression_loss: 0.1787 - classification_loss: 0.0178 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1963 - regression_loss: 0.1785 - classification_loss: 0.0178 363/500 [====================>.........] - ETA: 1:10 - loss: 0.1965 - regression_loss: 0.1786 - classification_loss: 0.0178 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1963 - regression_loss: 0.1784 - classification_loss: 0.0178 365/500 [====================>.........] - ETA: 1:09 - loss: 0.1960 - regression_loss: 0.1782 - classification_loss: 0.0178 366/500 [====================>.........] - ETA: 1:08 - loss: 0.1960 - regression_loss: 0.1782 - classification_loss: 0.0178 367/500 [=====================>........] - ETA: 1:08 - loss: 0.1957 - regression_loss: 0.1779 - classification_loss: 0.0178 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1957 - regression_loss: 0.1780 - classification_loss: 0.0178 369/500 [=====================>........] - ETA: 1:07 - loss: 0.1955 - regression_loss: 0.1778 - classification_loss: 0.0178 370/500 [=====================>........] 
- ETA: 1:06 - loss: 0.1954 - regression_loss: 0.1776 - classification_loss: 0.0177 371/500 [=====================>........] - ETA: 1:06 - loss: 0.1954 - regression_loss: 0.1777 - classification_loss: 0.0177 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1952 - regression_loss: 0.1775 - classification_loss: 0.0177 373/500 [=====================>........] - ETA: 1:05 - loss: 0.1953 - regression_loss: 0.1777 - classification_loss: 0.0177 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1950 - regression_loss: 0.1774 - classification_loss: 0.0177 375/500 [=====================>........] - ETA: 1:04 - loss: 0.1948 - regression_loss: 0.1771 - classification_loss: 0.0177 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1944 - regression_loss: 0.1768 - classification_loss: 0.0176 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1942 - regression_loss: 0.1766 - classification_loss: 0.0176 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1941 - regression_loss: 0.1765 - classification_loss: 0.0176 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1938 - regression_loss: 0.1762 - classification_loss: 0.0176 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1936 - regression_loss: 0.1761 - classification_loss: 0.0175 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1933 - regression_loss: 0.1758 - classification_loss: 0.0175 382/500 [=====================>........] - ETA: 1:00 - loss: 0.1930 - regression_loss: 0.1755 - classification_loss: 0.0175 383/500 [=====================>........] - ETA: 59s - loss: 0.1928 - regression_loss: 0.1753 - classification_loss: 0.0175  384/500 [======================>.......] - ETA: 59s - loss: 0.1927 - regression_loss: 0.1753 - classification_loss: 0.0175 385/500 [======================>.......] - ETA: 58s - loss: 0.1927 - regression_loss: 0.1752 - classification_loss: 0.0174 386/500 [======================>.......] 
- ETA: 58s - loss: 0.1930 - regression_loss: 0.1755 - classification_loss: 0.0175 387/500 [======================>.......] - ETA: 57s - loss: 0.1929 - regression_loss: 0.1755 - classification_loss: 0.0175 388/500 [======================>.......] - ETA: 57s - loss: 0.1926 - regression_loss: 0.1752 - classification_loss: 0.0175 389/500 [======================>.......] - ETA: 56s - loss: 0.1925 - regression_loss: 0.1751 - classification_loss: 0.0174 390/500 [======================>.......] - ETA: 56s - loss: 0.1924 - regression_loss: 0.1750 - classification_loss: 0.0174 391/500 [======================>.......] - ETA: 55s - loss: 0.1924 - regression_loss: 0.1750 - classification_loss: 0.0174 392/500 [======================>.......] - ETA: 55s - loss: 0.1925 - regression_loss: 0.1751 - classification_loss: 0.0174 393/500 [======================>.......] - ETA: 54s - loss: 0.1924 - regression_loss: 0.1750 - classification_loss: 0.0174 394/500 [======================>.......] - ETA: 54s - loss: 0.1923 - regression_loss: 0.1749 - classification_loss: 0.0174 395/500 [======================>.......] - ETA: 53s - loss: 0.1922 - regression_loss: 0.1748 - classification_loss: 0.0174 396/500 [======================>.......] - ETA: 53s - loss: 0.1920 - regression_loss: 0.1747 - classification_loss: 0.0173 397/500 [======================>.......] - ETA: 52s - loss: 0.1924 - regression_loss: 0.1751 - classification_loss: 0.0174 398/500 [======================>.......] - ETA: 52s - loss: 0.1920 - regression_loss: 0.1747 - classification_loss: 0.0173 399/500 [======================>.......] - ETA: 51s - loss: 0.1918 - regression_loss: 0.1745 - classification_loss: 0.0173 400/500 [=======================>......] - ETA: 51s - loss: 0.1916 - regression_loss: 0.1743 - classification_loss: 0.0173 401/500 [=======================>......] - ETA: 50s - loss: 0.1915 - regression_loss: 0.1742 - classification_loss: 0.0173 402/500 [=======================>......] 
- ETA: 50s - loss: 0.1915 - regression_loss: 0.1742 - classification_loss: 0.0173 403/500 [=======================>......] - ETA: 49s - loss: 0.1916 - regression_loss: 0.1743 - classification_loss: 0.0173 404/500 [=======================>......] - ETA: 49s - loss: 0.1917 - regression_loss: 0.1745 - classification_loss: 0.0173 405/500 [=======================>......] - ETA: 48s - loss: 0.1927 - regression_loss: 0.1754 - classification_loss: 0.0173 406/500 [=======================>......] - ETA: 48s - loss: 0.1935 - regression_loss: 0.1761 - classification_loss: 0.0174 407/500 [=======================>......] - ETA: 47s - loss: 0.1934 - regression_loss: 0.1760 - classification_loss: 0.0174 408/500 [=======================>......] - ETA: 47s - loss: 0.1931 - regression_loss: 0.1757 - classification_loss: 0.0174 409/500 [=======================>......] - ETA: 46s - loss: 0.1929 - regression_loss: 0.1756 - classification_loss: 0.0174 410/500 [=======================>......] - ETA: 46s - loss: 0.1927 - regression_loss: 0.1754 - classification_loss: 0.0173 411/500 [=======================>......] - ETA: 45s - loss: 0.1927 - regression_loss: 0.1754 - classification_loss: 0.0173 412/500 [=======================>......] - ETA: 45s - loss: 0.1930 - regression_loss: 0.1756 - classification_loss: 0.0174 413/500 [=======================>......] - ETA: 44s - loss: 0.1931 - regression_loss: 0.1758 - classification_loss: 0.0173 414/500 [=======================>......] - ETA: 44s - loss: 0.1929 - regression_loss: 0.1756 - classification_loss: 0.0173 415/500 [=======================>......] - ETA: 43s - loss: 0.1926 - regression_loss: 0.1753 - classification_loss: 0.0173 416/500 [=======================>......] - ETA: 43s - loss: 0.1924 - regression_loss: 0.1751 - classification_loss: 0.0173 417/500 [========================>.....] - ETA: 42s - loss: 0.1927 - regression_loss: 0.1753 - classification_loss: 0.0174 418/500 [========================>.....] 
- ETA: 41s - loss: 0.1925 - regression_loss: 0.1752 - classification_loss: 0.0174 419/500 [========================>.....] - ETA: 41s - loss: 0.1923 - regression_loss: 0.1750 - classification_loss: 0.0173 420/500 [========================>.....] - ETA: 40s - loss: 0.1923 - regression_loss: 0.1750 - classification_loss: 0.0173 421/500 [========================>.....] - ETA: 40s - loss: 0.1926 - regression_loss: 0.1753 - classification_loss: 0.0173 422/500 [========================>.....] - ETA: 39s - loss: 0.1925 - regression_loss: 0.1752 - classification_loss: 0.0173 423/500 [========================>.....] - ETA: 39s - loss: 0.1924 - regression_loss: 0.1751 - classification_loss: 0.0173 424/500 [========================>.....] - ETA: 38s - loss: 0.1923 - regression_loss: 0.1750 - classification_loss: 0.0172 425/500 [========================>.....] - ETA: 38s - loss: 0.1921 - regression_loss: 0.1749 - classification_loss: 0.0172 426/500 [========================>.....] - ETA: 37s - loss: 0.1919 - regression_loss: 0.1747 - classification_loss: 0.0172 427/500 [========================>.....] - ETA: 37s - loss: 0.1920 - regression_loss: 0.1747 - classification_loss: 0.0173 428/500 [========================>.....] - ETA: 36s - loss: 0.1937 - regression_loss: 0.1759 - classification_loss: 0.0178 429/500 [========================>.....] - ETA: 36s - loss: 0.1934 - regression_loss: 0.1756 - classification_loss: 0.0178 430/500 [========================>.....] - ETA: 35s - loss: 0.1935 - regression_loss: 0.1757 - classification_loss: 0.0178 431/500 [========================>.....] - ETA: 35s - loss: 0.1933 - regression_loss: 0.1755 - classification_loss: 0.0178 432/500 [========================>.....] - ETA: 34s - loss: 0.1932 - regression_loss: 0.1755 - classification_loss: 0.0178 433/500 [========================>.....] - ETA: 34s - loss: 0.1946 - regression_loss: 0.1767 - classification_loss: 0.0179 434/500 [=========================>....] 
[transient per-step progress-bar updates for steps 434-499 of epoch 6 elided; running loss declined from 0.1944 to 0.1913 over these steps]
500/500 [==============================] - 256s 512ms/step - loss: 0.1910 - regression_loss: 0.1733 - classification_loss: 0.0177
326 instances of class plum with average precision: 0.8406
mAP: 0.8406
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/150
[transient per-step progress-bar updates for steps 1-268 of epoch 7 elided; running loss fluctuated between roughly 0.13 and 0.17, standing at 0.1549 - regression_loss: 0.1408 - classification_loss: 0.0141 at step 268; log truncated mid-epoch]
- ETA: 1:57 - loss: 0.1548 - regression_loss: 0.1407 - classification_loss: 0.0141 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1545 - regression_loss: 0.1404 - classification_loss: 0.0141 271/500 [===============>..............] - ETA: 1:56 - loss: 0.1541 - regression_loss: 0.1401 - classification_loss: 0.0140 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1542 - regression_loss: 0.1402 - classification_loss: 0.0140 273/500 [===============>..............] - ETA: 1:55 - loss: 0.1541 - regression_loss: 0.1401 - classification_loss: 0.0140 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1540 - regression_loss: 0.1400 - classification_loss: 0.0140 275/500 [===============>..............] - ETA: 1:54 - loss: 0.1538 - regression_loss: 0.1398 - classification_loss: 0.0140 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1565 - regression_loss: 0.1417 - classification_loss: 0.0149 277/500 [===============>..............] - ETA: 1:53 - loss: 0.1564 - regression_loss: 0.1415 - classification_loss: 0.0149 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1561 - regression_loss: 0.1413 - classification_loss: 0.0148 279/500 [===============>..............] - ETA: 1:52 - loss: 0.1562 - regression_loss: 0.1414 - classification_loss: 0.0149 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1558 - regression_loss: 0.1410 - classification_loss: 0.0148 281/500 [===============>..............] - ETA: 1:51 - loss: 0.1558 - regression_loss: 0.1410 - classification_loss: 0.0148 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1558 - regression_loss: 0.1410 - classification_loss: 0.0148 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1557 - regression_loss: 0.1409 - classification_loss: 0.0148 284/500 [================>.............] - ETA: 1:50 - loss: 0.1554 - regression_loss: 0.1407 - classification_loss: 0.0148 285/500 [================>.............] 
- ETA: 1:49 - loss: 0.1551 - regression_loss: 0.1404 - classification_loss: 0.0148 286/500 [================>.............] - ETA: 1:49 - loss: 0.1553 - regression_loss: 0.1405 - classification_loss: 0.0147 287/500 [================>.............] - ETA: 1:48 - loss: 0.1551 - regression_loss: 0.1404 - classification_loss: 0.0147 288/500 [================>.............] - ETA: 1:48 - loss: 0.1550 - regression_loss: 0.1403 - classification_loss: 0.0147 289/500 [================>.............] - ETA: 1:47 - loss: 0.1555 - regression_loss: 0.1408 - classification_loss: 0.0147 290/500 [================>.............] - ETA: 1:47 - loss: 0.1559 - regression_loss: 0.1412 - classification_loss: 0.0148 291/500 [================>.............] - ETA: 1:46 - loss: 0.1556 - regression_loss: 0.1409 - classification_loss: 0.0148 292/500 [================>.............] - ETA: 1:46 - loss: 0.1553 - regression_loss: 0.1406 - classification_loss: 0.0147 293/500 [================>.............] - ETA: 1:45 - loss: 0.1554 - regression_loss: 0.1407 - classification_loss: 0.0147 294/500 [================>.............] - ETA: 1:45 - loss: 0.1564 - regression_loss: 0.1417 - classification_loss: 0.0147 295/500 [================>.............] - ETA: 1:44 - loss: 0.1563 - regression_loss: 0.1416 - classification_loss: 0.0147 296/500 [================>.............] - ETA: 1:44 - loss: 0.1560 - regression_loss: 0.1414 - classification_loss: 0.0147 297/500 [================>.............] - ETA: 1:43 - loss: 0.1563 - regression_loss: 0.1416 - classification_loss: 0.0147 298/500 [================>.............] - ETA: 1:43 - loss: 0.1561 - regression_loss: 0.1414 - classification_loss: 0.0147 299/500 [================>.............] - ETA: 1:42 - loss: 0.1566 - regression_loss: 0.1419 - classification_loss: 0.0147 300/500 [=================>............] - ETA: 1:42 - loss: 0.1569 - regression_loss: 0.1421 - classification_loss: 0.0148 301/500 [=================>............] 
- ETA: 1:41 - loss: 0.1569 - regression_loss: 0.1421 - classification_loss: 0.0148 302/500 [=================>............] - ETA: 1:41 - loss: 0.1568 - regression_loss: 0.1420 - classification_loss: 0.0148 303/500 [=================>............] - ETA: 1:40 - loss: 0.1582 - regression_loss: 0.1434 - classification_loss: 0.0149 304/500 [=================>............] - ETA: 1:40 - loss: 0.1581 - regression_loss: 0.1432 - classification_loss: 0.0149 305/500 [=================>............] - ETA: 1:39 - loss: 0.1582 - regression_loss: 0.1434 - classification_loss: 0.0148 306/500 [=================>............] - ETA: 1:39 - loss: 0.1584 - regression_loss: 0.1435 - classification_loss: 0.0148 307/500 [=================>............] - ETA: 1:38 - loss: 0.1586 - regression_loss: 0.1437 - classification_loss: 0.0148 308/500 [=================>............] - ETA: 1:37 - loss: 0.1587 - regression_loss: 0.1438 - classification_loss: 0.0148 309/500 [=================>............] - ETA: 1:37 - loss: 0.1588 - regression_loss: 0.1440 - classification_loss: 0.0148 310/500 [=================>............] - ETA: 1:36 - loss: 0.1586 - regression_loss: 0.1438 - classification_loss: 0.0148 311/500 [=================>............] - ETA: 1:36 - loss: 0.1587 - regression_loss: 0.1439 - classification_loss: 0.0148 312/500 [=================>............] - ETA: 1:35 - loss: 0.1593 - regression_loss: 0.1444 - classification_loss: 0.0149 313/500 [=================>............] - ETA: 1:35 - loss: 0.1590 - regression_loss: 0.1441 - classification_loss: 0.0149 314/500 [=================>............] - ETA: 1:34 - loss: 0.1588 - regression_loss: 0.1439 - classification_loss: 0.0149 315/500 [=================>............] - ETA: 1:34 - loss: 0.1586 - regression_loss: 0.1438 - classification_loss: 0.0148 316/500 [=================>............] - ETA: 1:33 - loss: 0.1587 - regression_loss: 0.1438 - classification_loss: 0.0149 317/500 [==================>...........] 
- ETA: 1:33 - loss: 0.1585 - regression_loss: 0.1437 - classification_loss: 0.0148 318/500 [==================>...........] - ETA: 1:32 - loss: 0.1583 - regression_loss: 0.1435 - classification_loss: 0.0148 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1582 - regression_loss: 0.1434 - classification_loss: 0.0148 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1581 - regression_loss: 0.1433 - classification_loss: 0.0148 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1580 - regression_loss: 0.1432 - classification_loss: 0.0148 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1578 - regression_loss: 0.1430 - classification_loss: 0.0148 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1576 - regression_loss: 0.1429 - classification_loss: 0.0148 324/500 [==================>...........] - ETA: 1:29 - loss: 0.1577 - regression_loss: 0.1429 - classification_loss: 0.0148 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1576 - regression_loss: 0.1428 - classification_loss: 0.0148 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1580 - regression_loss: 0.1432 - classification_loss: 0.0148 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1594 - regression_loss: 0.1442 - classification_loss: 0.0151 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1612 - regression_loss: 0.1459 - classification_loss: 0.0153 329/500 [==================>...........] - ETA: 1:27 - loss: 0.1611 - regression_loss: 0.1458 - classification_loss: 0.0153 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1608 - regression_loss: 0.1456 - classification_loss: 0.0152 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1606 - regression_loss: 0.1453 - classification_loss: 0.0152 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1607 - regression_loss: 0.1455 - classification_loss: 0.0152 333/500 [==================>...........] 
- ETA: 1:25 - loss: 0.1606 - regression_loss: 0.1454 - classification_loss: 0.0152 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1604 - regression_loss: 0.1452 - classification_loss: 0.0152 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1603 - regression_loss: 0.1451 - classification_loss: 0.0152 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1601 - regression_loss: 0.1449 - classification_loss: 0.0152 337/500 [===================>..........] - ETA: 1:23 - loss: 0.1599 - regression_loss: 0.1447 - classification_loss: 0.0152 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1597 - regression_loss: 0.1445 - classification_loss: 0.0151 339/500 [===================>..........] - ETA: 1:22 - loss: 0.1596 - regression_loss: 0.1444 - classification_loss: 0.0151 340/500 [===================>..........] - ETA: 1:21 - loss: 0.1594 - regression_loss: 0.1443 - classification_loss: 0.0151 341/500 [===================>..........] - ETA: 1:21 - loss: 0.1593 - regression_loss: 0.1441 - classification_loss: 0.0151 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1592 - regression_loss: 0.1441 - classification_loss: 0.0151 343/500 [===================>..........] - ETA: 1:20 - loss: 0.1601 - regression_loss: 0.1449 - classification_loss: 0.0152 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1613 - regression_loss: 0.1459 - classification_loss: 0.0155 345/500 [===================>..........] - ETA: 1:19 - loss: 0.1616 - regression_loss: 0.1462 - classification_loss: 0.0154 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1614 - regression_loss: 0.1460 - classification_loss: 0.0154 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1614 - regression_loss: 0.1460 - classification_loss: 0.0154 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1611 - regression_loss: 0.1457 - classification_loss: 0.0154 349/500 [===================>..........] 
- ETA: 1:17 - loss: 0.1608 - regression_loss: 0.1454 - classification_loss: 0.0154 350/500 [====================>.........] - ETA: 1:16 - loss: 0.1608 - regression_loss: 0.1454 - classification_loss: 0.0154 351/500 [====================>.........] - ETA: 1:16 - loss: 0.1606 - regression_loss: 0.1453 - classification_loss: 0.0153 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1608 - regression_loss: 0.1454 - classification_loss: 0.0154 353/500 [====================>.........] - ETA: 1:15 - loss: 0.1609 - regression_loss: 0.1455 - classification_loss: 0.0154 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1608 - regression_loss: 0.1455 - classification_loss: 0.0154 355/500 [====================>.........] - ETA: 1:13 - loss: 0.1606 - regression_loss: 0.1453 - classification_loss: 0.0153 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1613 - regression_loss: 0.1460 - classification_loss: 0.0153 357/500 [====================>.........] - ETA: 1:12 - loss: 0.1610 - regression_loss: 0.1457 - classification_loss: 0.0153 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1608 - regression_loss: 0.1455 - classification_loss: 0.0153 359/500 [====================>.........] - ETA: 1:11 - loss: 0.1606 - regression_loss: 0.1453 - classification_loss: 0.0153 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1603 - regression_loss: 0.1451 - classification_loss: 0.0152 361/500 [====================>.........] - ETA: 1:10 - loss: 0.1608 - regression_loss: 0.1456 - classification_loss: 0.0152 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1606 - regression_loss: 0.1454 - classification_loss: 0.0152 363/500 [====================>.........] - ETA: 1:09 - loss: 0.1616 - regression_loss: 0.1462 - classification_loss: 0.0155 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1615 - regression_loss: 0.1461 - classification_loss: 0.0154 365/500 [====================>.........] 
- ETA: 1:08 - loss: 0.1614 - regression_loss: 0.1459 - classification_loss: 0.0154 366/500 [====================>.........] - ETA: 1:08 - loss: 0.1612 - regression_loss: 0.1458 - classification_loss: 0.0154 367/500 [=====================>........] - ETA: 1:07 - loss: 0.1611 - regression_loss: 0.1457 - classification_loss: 0.0154 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1610 - regression_loss: 0.1456 - classification_loss: 0.0154 369/500 [=====================>........] - ETA: 1:06 - loss: 0.1612 - regression_loss: 0.1457 - classification_loss: 0.0154 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1609 - regression_loss: 0.1456 - classification_loss: 0.0154 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1609 - regression_loss: 0.1455 - classification_loss: 0.0154 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1608 - regression_loss: 0.1454 - classification_loss: 0.0154 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1606 - regression_loss: 0.1453 - classification_loss: 0.0154 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1605 - regression_loss: 0.1452 - classification_loss: 0.0153 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1602 - regression_loss: 0.1449 - classification_loss: 0.0153 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1600 - regression_loss: 0.1447 - classification_loss: 0.0153 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1602 - regression_loss: 0.1449 - classification_loss: 0.0153 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1602 - regression_loss: 0.1449 - classification_loss: 0.0153 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1603 - regression_loss: 0.1450 - classification_loss: 0.0153 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1608 - regression_loss: 0.1455 - classification_loss: 0.0153 381/500 [=====================>........] 
- ETA: 1:00 - loss: 0.1609 - regression_loss: 0.1456 - classification_loss: 0.0153 382/500 [=====================>........] - ETA: 1:00 - loss: 0.1608 - regression_loss: 0.1455 - classification_loss: 0.0153 383/500 [=====================>........] - ETA: 59s - loss: 0.1609 - regression_loss: 0.1456 - classification_loss: 0.0153  384/500 [======================>.......] - ETA: 59s - loss: 0.1606 - regression_loss: 0.1453 - classification_loss: 0.0153 385/500 [======================>.......] - ETA: 58s - loss: 0.1605 - regression_loss: 0.1453 - classification_loss: 0.0153 386/500 [======================>.......] - ETA: 58s - loss: 0.1603 - regression_loss: 0.1451 - classification_loss: 0.0153 387/500 [======================>.......] - ETA: 57s - loss: 0.1607 - regression_loss: 0.1454 - classification_loss: 0.0153 388/500 [======================>.......] - ETA: 57s - loss: 0.1607 - regression_loss: 0.1454 - classification_loss: 0.0153 389/500 [======================>.......] - ETA: 56s - loss: 0.1607 - regression_loss: 0.1454 - classification_loss: 0.0153 390/500 [======================>.......] - ETA: 56s - loss: 0.1606 - regression_loss: 0.1453 - classification_loss: 0.0153 391/500 [======================>.......] - ETA: 55s - loss: 0.1605 - regression_loss: 0.1453 - classification_loss: 0.0153 392/500 [======================>.......] - ETA: 55s - loss: 0.1603 - regression_loss: 0.1451 - classification_loss: 0.0152 393/500 [======================>.......] - ETA: 54s - loss: 0.1602 - regression_loss: 0.1450 - classification_loss: 0.0152 394/500 [======================>.......] - ETA: 54s - loss: 0.1601 - regression_loss: 0.1449 - classification_loss: 0.0152 395/500 [======================>.......] - ETA: 53s - loss: 0.1598 - regression_loss: 0.1446 - classification_loss: 0.0152 396/500 [======================>.......] - ETA: 53s - loss: 0.1596 - regression_loss: 0.1445 - classification_loss: 0.0152 397/500 [======================>.......] 
- ETA: 52s - loss: 0.1594 - regression_loss: 0.1443 - classification_loss: 0.0152 398/500 [======================>.......] - ETA: 52s - loss: 0.1594 - regression_loss: 0.1443 - classification_loss: 0.0151 399/500 [======================>.......] - ETA: 51s - loss: 0.1593 - regression_loss: 0.1442 - classification_loss: 0.0152 400/500 [=======================>......] - ETA: 51s - loss: 0.1592 - regression_loss: 0.1441 - classification_loss: 0.0151 401/500 [=======================>......] - ETA: 50s - loss: 0.1595 - regression_loss: 0.1444 - classification_loss: 0.0151 402/500 [=======================>......] - ETA: 50s - loss: 0.1592 - regression_loss: 0.1441 - classification_loss: 0.0151 403/500 [=======================>......] - ETA: 49s - loss: 0.1591 - regression_loss: 0.1440 - classification_loss: 0.0151 404/500 [=======================>......] - ETA: 49s - loss: 0.1589 - regression_loss: 0.1438 - classification_loss: 0.0151 405/500 [=======================>......] - ETA: 48s - loss: 0.1590 - regression_loss: 0.1440 - classification_loss: 0.0151 406/500 [=======================>......] - ETA: 47s - loss: 0.1590 - regression_loss: 0.1439 - classification_loss: 0.0151 407/500 [=======================>......] - ETA: 47s - loss: 0.1588 - regression_loss: 0.1438 - classification_loss: 0.0150 408/500 [=======================>......] - ETA: 46s - loss: 0.1587 - regression_loss: 0.1437 - classification_loss: 0.0150 409/500 [=======================>......] - ETA: 46s - loss: 0.1584 - regression_loss: 0.1434 - classification_loss: 0.0150 410/500 [=======================>......] - ETA: 45s - loss: 0.1583 - regression_loss: 0.1433 - classification_loss: 0.0150 411/500 [=======================>......] - ETA: 45s - loss: 0.1581 - regression_loss: 0.1432 - classification_loss: 0.0150 412/500 [=======================>......] - ETA: 44s - loss: 0.1579 - regression_loss: 0.1430 - classification_loss: 0.0150 413/500 [=======================>......] 
- ETA: 44s - loss: 0.1579 - regression_loss: 0.1430 - classification_loss: 0.0149 414/500 [=======================>......] - ETA: 43s - loss: 0.1580 - regression_loss: 0.1431 - classification_loss: 0.0150 415/500 [=======================>......] - ETA: 43s - loss: 0.1581 - regression_loss: 0.1431 - classification_loss: 0.0150 416/500 [=======================>......] - ETA: 42s - loss: 0.1580 - regression_loss: 0.1430 - classification_loss: 0.0150 417/500 [========================>.....] - ETA: 42s - loss: 0.1583 - regression_loss: 0.1433 - classification_loss: 0.0150 418/500 [========================>.....] - ETA: 41s - loss: 0.1582 - regression_loss: 0.1432 - classification_loss: 0.0150 419/500 [========================>.....] - ETA: 41s - loss: 0.1580 - regression_loss: 0.1431 - classification_loss: 0.0150 420/500 [========================>.....] - ETA: 40s - loss: 0.1578 - regression_loss: 0.1428 - classification_loss: 0.0150 421/500 [========================>.....] - ETA: 40s - loss: 0.1577 - regression_loss: 0.1428 - classification_loss: 0.0150 422/500 [========================>.....] - ETA: 39s - loss: 0.1577 - regression_loss: 0.1427 - classification_loss: 0.0149 423/500 [========================>.....] - ETA: 39s - loss: 0.1577 - regression_loss: 0.1428 - classification_loss: 0.0149 424/500 [========================>.....] - ETA: 38s - loss: 0.1576 - regression_loss: 0.1426 - classification_loss: 0.0149 425/500 [========================>.....] - ETA: 38s - loss: 0.1577 - regression_loss: 0.1427 - classification_loss: 0.0150 426/500 [========================>.....] - ETA: 37s - loss: 0.1575 - regression_loss: 0.1426 - classification_loss: 0.0149 427/500 [========================>.....] - ETA: 37s - loss: 0.1574 - regression_loss: 0.1425 - classification_loss: 0.0149 428/500 [========================>.....] - ETA: 36s - loss: 0.1580 - regression_loss: 0.1429 - classification_loss: 0.0150 429/500 [========================>.....] 
- ETA: 36s - loss: 0.1581 - regression_loss: 0.1430 - classification_loss: 0.0150 430/500 [========================>.....] - ETA: 35s - loss: 0.1598 - regression_loss: 0.1443 - classification_loss: 0.0156 431/500 [========================>.....] - ETA: 35s - loss: 0.1597 - regression_loss: 0.1441 - classification_loss: 0.0156 432/500 [========================>.....] - ETA: 34s - loss: 0.1597 - regression_loss: 0.1442 - classification_loss: 0.0156 433/500 [========================>.....] - ETA: 34s - loss: 0.1596 - regression_loss: 0.1440 - classification_loss: 0.0156 434/500 [=========================>....] - ETA: 33s - loss: 0.1595 - regression_loss: 0.1439 - classification_loss: 0.0155 435/500 [=========================>....] - ETA: 33s - loss: 0.1595 - regression_loss: 0.1439 - classification_loss: 0.0155 436/500 [=========================>....] - ETA: 32s - loss: 0.1603 - regression_loss: 0.1446 - classification_loss: 0.0157 437/500 [=========================>....] - ETA: 32s - loss: 0.1617 - regression_loss: 0.1459 - classification_loss: 0.0159 438/500 [=========================>....] - ETA: 31s - loss: 0.1618 - regression_loss: 0.1460 - classification_loss: 0.0159 439/500 [=========================>....] - ETA: 31s - loss: 0.1617 - regression_loss: 0.1458 - classification_loss: 0.0159 440/500 [=========================>....] - ETA: 30s - loss: 0.1614 - regression_loss: 0.1456 - classification_loss: 0.0158 441/500 [=========================>....] - ETA: 30s - loss: 0.1624 - regression_loss: 0.1465 - classification_loss: 0.0159 442/500 [=========================>....] - ETA: 29s - loss: 0.1623 - regression_loss: 0.1463 - classification_loss: 0.0159 443/500 [=========================>....] - ETA: 29s - loss: 0.1625 - regression_loss: 0.1465 - classification_loss: 0.0159 444/500 [=========================>....] - ETA: 28s - loss: 0.1623 - regression_loss: 0.1463 - classification_loss: 0.0159 445/500 [=========================>....] 
- ETA: 28s - loss: 0.1621 - regression_loss: 0.1462 - classification_loss: 0.0159 446/500 [=========================>....] - ETA: 27s - loss: 0.1620 - regression_loss: 0.1461 - classification_loss: 0.0159 447/500 [=========================>....] - ETA: 27s - loss: 0.1623 - regression_loss: 0.1464 - classification_loss: 0.0160 448/500 [=========================>....] - ETA: 26s - loss: 0.1622 - regression_loss: 0.1463 - classification_loss: 0.0159 449/500 [=========================>....] - ETA: 26s - loss: 0.1622 - regression_loss: 0.1463 - classification_loss: 0.0159 450/500 [==========================>...] - ETA: 25s - loss: 0.1623 - regression_loss: 0.1464 - classification_loss: 0.0159 451/500 [==========================>...] - ETA: 25s - loss: 0.1625 - regression_loss: 0.1466 - classification_loss: 0.0159 452/500 [==========================>...] - ETA: 24s - loss: 0.1623 - regression_loss: 0.1464 - classification_loss: 0.0159 453/500 [==========================>...] - ETA: 24s - loss: 0.1623 - regression_loss: 0.1464 - classification_loss: 0.0159 454/500 [==========================>...] - ETA: 23s - loss: 0.1621 - regression_loss: 0.1463 - classification_loss: 0.0159 455/500 [==========================>...] - ETA: 22s - loss: 0.1620 - regression_loss: 0.1461 - classification_loss: 0.0159 456/500 [==========================>...] - ETA: 22s - loss: 0.1618 - regression_loss: 0.1460 - classification_loss: 0.0158 457/500 [==========================>...] - ETA: 21s - loss: 0.1617 - regression_loss: 0.1459 - classification_loss: 0.0158 458/500 [==========================>...] - ETA: 21s - loss: 0.1618 - regression_loss: 0.1459 - classification_loss: 0.0159 459/500 [==========================>...] - ETA: 20s - loss: 0.1615 - regression_loss: 0.1457 - classification_loss: 0.0158 460/500 [==========================>...] - ETA: 20s - loss: 0.1614 - regression_loss: 0.1456 - classification_loss: 0.0158 461/500 [==========================>...] 
- ETA: 19s - loss: 0.1617 - regression_loss: 0.1459 - classification_loss: 0.0159 462/500 [==========================>...] - ETA: 19s - loss: 0.1618 - regression_loss: 0.1459 - classification_loss: 0.0159 463/500 [==========================>...] - ETA: 18s - loss: 0.1616 - regression_loss: 0.1458 - classification_loss: 0.0159 464/500 [==========================>...] - ETA: 18s - loss: 0.1615 - regression_loss: 0.1456 - classification_loss: 0.0159 465/500 [==========================>...] - ETA: 17s - loss: 0.1615 - regression_loss: 0.1456 - classification_loss: 0.0159 466/500 [==========================>...] - ETA: 17s - loss: 0.1613 - regression_loss: 0.1455 - classification_loss: 0.0158 467/500 [===========================>..] - ETA: 16s - loss: 0.1616 - regression_loss: 0.1458 - classification_loss: 0.0159 468/500 [===========================>..] - ETA: 16s - loss: 0.1617 - regression_loss: 0.1458 - classification_loss: 0.0159 469/500 [===========================>..] - ETA: 15s - loss: 0.1615 - regression_loss: 0.1457 - classification_loss: 0.0158 470/500 [===========================>..] - ETA: 15s - loss: 0.1613 - regression_loss: 0.1455 - classification_loss: 0.0158 471/500 [===========================>..] - ETA: 14s - loss: 0.1611 - regression_loss: 0.1453 - classification_loss: 0.0158 472/500 [===========================>..] - ETA: 14s - loss: 0.1609 - regression_loss: 0.1451 - classification_loss: 0.0158 473/500 [===========================>..] - ETA: 13s - loss: 0.1607 - regression_loss: 0.1450 - classification_loss: 0.0158 474/500 [===========================>..] - ETA: 13s - loss: 0.1607 - regression_loss: 0.1449 - classification_loss: 0.0158 475/500 [===========================>..] - ETA: 12s - loss: 0.1605 - regression_loss: 0.1447 - classification_loss: 0.0158 476/500 [===========================>..] - ETA: 12s - loss: 0.1603 - regression_loss: 0.1446 - classification_loss: 0.0157 477/500 [===========================>..] 
- ETA: 11s - loss: 0.1601 - regression_loss: 0.1444 - classification_loss: 0.0157
500/500 [==============================] - 255s 511ms/step - loss: 0.1588 - regression_loss: 0.1433 - classification_loss: 0.0155
326 instances of class plum with average precision: 0.8364
mAP: 0.8364
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/150
  1/500 [..............................] - ETA: 4:15 - loss: 0.1885 - regression_loss: 0.1704 - classification_loss: 0.0182
 50/500 [==>...........................] - ETA: 3:48 - loss: 0.1287 - regression_loss: 0.1165 - classification_loss: 0.0122
100/500 [=====>........................] - ETA: 3:23 - loss: 0.1310 - regression_loss: 0.1170 - classification_loss: 0.0140
150/500 [========>.....................] - ETA: 2:58 - loss: 0.1392 - regression_loss: 0.1245 - classification_loss: 0.0147
200/500 [===========>..................] - ETA: 2:32 - loss: 0.1453 - regression_loss: 0.1303 - classification_loss: 0.0150
250/500 [==============>...............] - ETA: 2:07 - loss: 0.1407 - regression_loss: 0.1260 - classification_loss: 0.0147
300/500 [=================>............] - ETA: 1:42 - loss: 0.1407 - regression_loss: 0.1257 - classification_loss: 0.0150
312/500 [=================>............]
- ETA: 1:35 - loss: 0.1405 - regression_loss: 0.1256 - classification_loss: 0.0149 313/500 [=================>............] - ETA: 1:35 - loss: 0.1403 - regression_loss: 0.1254 - classification_loss: 0.0149 314/500 [=================>............] - ETA: 1:34 - loss: 0.1400 - regression_loss: 0.1251 - classification_loss: 0.0149 315/500 [=================>............] - ETA: 1:34 - loss: 0.1402 - regression_loss: 0.1254 - classification_loss: 0.0149 316/500 [=================>............] - ETA: 1:33 - loss: 0.1416 - regression_loss: 0.1266 - classification_loss: 0.0150 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1427 - regression_loss: 0.1274 - classification_loss: 0.0153 318/500 [==================>...........] - ETA: 1:32 - loss: 0.1425 - regression_loss: 0.1273 - classification_loss: 0.0153 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1430 - regression_loss: 0.1277 - classification_loss: 0.0153 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1433 - regression_loss: 0.1280 - classification_loss: 0.0153 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1431 - regression_loss: 0.1278 - classification_loss: 0.0153 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1442 - regression_loss: 0.1286 - classification_loss: 0.0156 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1439 - regression_loss: 0.1284 - classification_loss: 0.0155 324/500 [==================>...........] - ETA: 1:29 - loss: 0.1438 - regression_loss: 0.1282 - classification_loss: 0.0155 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1435 - regression_loss: 0.1280 - classification_loss: 0.0155 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1434 - regression_loss: 0.1278 - classification_loss: 0.0155 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1452 - regression_loss: 0.1295 - classification_loss: 0.0157 328/500 [==================>...........] 
- ETA: 1:27 - loss: 0.1454 - regression_loss: 0.1297 - classification_loss: 0.0157 329/500 [==================>...........] - ETA: 1:27 - loss: 0.1459 - regression_loss: 0.1302 - classification_loss: 0.0157 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1459 - regression_loss: 0.1301 - classification_loss: 0.0157 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1460 - regression_loss: 0.1302 - classification_loss: 0.0157 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1461 - regression_loss: 0.1304 - classification_loss: 0.0157 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1460 - regression_loss: 0.1303 - classification_loss: 0.0157 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1457 - regression_loss: 0.1300 - classification_loss: 0.0157 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1454 - regression_loss: 0.1298 - classification_loss: 0.0156 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1452 - regression_loss: 0.1296 - classification_loss: 0.0156 337/500 [===================>..........] - ETA: 1:23 - loss: 0.1449 - regression_loss: 0.1293 - classification_loss: 0.0156 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1450 - regression_loss: 0.1295 - classification_loss: 0.0156 339/500 [===================>..........] - ETA: 1:22 - loss: 0.1450 - regression_loss: 0.1294 - classification_loss: 0.0156 340/500 [===================>..........] - ETA: 1:21 - loss: 0.1447 - regression_loss: 0.1292 - classification_loss: 0.0155 341/500 [===================>..........] - ETA: 1:21 - loss: 0.1446 - regression_loss: 0.1290 - classification_loss: 0.0155 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1458 - regression_loss: 0.1302 - classification_loss: 0.0156 343/500 [===================>..........] - ETA: 1:20 - loss: 0.1455 - regression_loss: 0.1300 - classification_loss: 0.0156 344/500 [===================>..........] 
- ETA: 1:19 - loss: 0.1454 - regression_loss: 0.1298 - classification_loss: 0.0155 345/500 [===================>..........] - ETA: 1:19 - loss: 0.1455 - regression_loss: 0.1299 - classification_loss: 0.0155 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1453 - regression_loss: 0.1298 - classification_loss: 0.0155 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1459 - regression_loss: 0.1303 - classification_loss: 0.0156 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1459 - regression_loss: 0.1302 - classification_loss: 0.0156 349/500 [===================>..........] - ETA: 1:17 - loss: 0.1456 - regression_loss: 0.1300 - classification_loss: 0.0156 350/500 [====================>.........] - ETA: 1:16 - loss: 0.1454 - regression_loss: 0.1298 - classification_loss: 0.0156 351/500 [====================>.........] - ETA: 1:16 - loss: 0.1455 - regression_loss: 0.1299 - classification_loss: 0.0156 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1452 - regression_loss: 0.1296 - classification_loss: 0.0156 353/500 [====================>.........] - ETA: 1:15 - loss: 0.1451 - regression_loss: 0.1295 - classification_loss: 0.0156 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1451 - regression_loss: 0.1296 - classification_loss: 0.0155 355/500 [====================>.........] - ETA: 1:14 - loss: 0.1449 - regression_loss: 0.1294 - classification_loss: 0.0155 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1447 - regression_loss: 0.1292 - classification_loss: 0.0155 357/500 [====================>.........] - ETA: 1:13 - loss: 0.1444 - regression_loss: 0.1289 - classification_loss: 0.0155 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1441 - regression_loss: 0.1287 - classification_loss: 0.0155 359/500 [====================>.........] - ETA: 1:12 - loss: 0.1440 - regression_loss: 0.1286 - classification_loss: 0.0155 360/500 [====================>.........] 
- ETA: 1:11 - loss: 0.1442 - regression_loss: 0.1288 - classification_loss: 0.0155 361/500 [====================>.........] - ETA: 1:10 - loss: 0.1440 - regression_loss: 0.1285 - classification_loss: 0.0154 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1439 - regression_loss: 0.1284 - classification_loss: 0.0154 363/500 [====================>.........] - ETA: 1:09 - loss: 0.1439 - regression_loss: 0.1285 - classification_loss: 0.0154 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1446 - regression_loss: 0.1292 - classification_loss: 0.0154 365/500 [====================>.........] - ETA: 1:08 - loss: 0.1444 - regression_loss: 0.1290 - classification_loss: 0.0154 366/500 [====================>.........] - ETA: 1:08 - loss: 0.1443 - regression_loss: 0.1289 - classification_loss: 0.0153 367/500 [=====================>........] - ETA: 1:07 - loss: 0.1442 - regression_loss: 0.1289 - classification_loss: 0.0153 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1447 - regression_loss: 0.1293 - classification_loss: 0.0154 369/500 [=====================>........] - ETA: 1:06 - loss: 0.1446 - regression_loss: 0.1293 - classification_loss: 0.0154 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1445 - regression_loss: 0.1291 - classification_loss: 0.0154 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1445 - regression_loss: 0.1291 - classification_loss: 0.0154 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1444 - regression_loss: 0.1290 - classification_loss: 0.0154 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1444 - regression_loss: 0.1290 - classification_loss: 0.0154 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1443 - regression_loss: 0.1290 - classification_loss: 0.0154 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1443 - regression_loss: 0.1290 - classification_loss: 0.0153 376/500 [=====================>........] 
- ETA: 1:03 - loss: 0.1441 - regression_loss: 0.1287 - classification_loss: 0.0153 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1443 - regression_loss: 0.1290 - classification_loss: 0.0153 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1442 - regression_loss: 0.1289 - classification_loss: 0.0153 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1441 - regression_loss: 0.1288 - classification_loss: 0.0153 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1445 - regression_loss: 0.1291 - classification_loss: 0.0153 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1445 - regression_loss: 0.1292 - classification_loss: 0.0153 382/500 [=====================>........] - ETA: 1:00 - loss: 0.1446 - regression_loss: 0.1292 - classification_loss: 0.0154 383/500 [=====================>........] - ETA: 59s - loss: 0.1445 - regression_loss: 0.1292 - classification_loss: 0.0154  384/500 [======================>.......] - ETA: 59s - loss: 0.1456 - regression_loss: 0.1302 - classification_loss: 0.0154 385/500 [======================>.......] - ETA: 58s - loss: 0.1454 - regression_loss: 0.1301 - classification_loss: 0.0154 386/500 [======================>.......] - ETA: 58s - loss: 0.1452 - regression_loss: 0.1298 - classification_loss: 0.0154 387/500 [======================>.......] - ETA: 57s - loss: 0.1451 - regression_loss: 0.1297 - classification_loss: 0.0154 388/500 [======================>.......] - ETA: 57s - loss: 0.1448 - regression_loss: 0.1294 - classification_loss: 0.0153 389/500 [======================>.......] - ETA: 56s - loss: 0.1447 - regression_loss: 0.1294 - classification_loss: 0.0153 390/500 [======================>.......] - ETA: 56s - loss: 0.1446 - regression_loss: 0.1292 - classification_loss: 0.0153 391/500 [======================>.......] - ETA: 55s - loss: 0.1447 - regression_loss: 0.1294 - classification_loss: 0.0153 392/500 [======================>.......] 
- ETA: 55s - loss: 0.1444 - regression_loss: 0.1291 - classification_loss: 0.0153 393/500 [======================>.......] - ETA: 54s - loss: 0.1445 - regression_loss: 0.1292 - classification_loss: 0.0153 394/500 [======================>.......] - ETA: 54s - loss: 0.1445 - regression_loss: 0.1293 - classification_loss: 0.0153 395/500 [======================>.......] - ETA: 53s - loss: 0.1444 - regression_loss: 0.1291 - classification_loss: 0.0153 396/500 [======================>.......] - ETA: 53s - loss: 0.1449 - regression_loss: 0.1296 - classification_loss: 0.0153 397/500 [======================>.......] - ETA: 52s - loss: 0.1447 - regression_loss: 0.1294 - classification_loss: 0.0152 398/500 [======================>.......] - ETA: 52s - loss: 0.1461 - regression_loss: 0.1308 - classification_loss: 0.0154 399/500 [======================>.......] - ETA: 51s - loss: 0.1462 - regression_loss: 0.1308 - classification_loss: 0.0154 400/500 [=======================>......] - ETA: 51s - loss: 0.1459 - regression_loss: 0.1306 - classification_loss: 0.0154 401/500 [=======================>......] - ETA: 50s - loss: 0.1459 - regression_loss: 0.1306 - classification_loss: 0.0153 402/500 [=======================>......] - ETA: 50s - loss: 0.1459 - regression_loss: 0.1305 - classification_loss: 0.0153 403/500 [=======================>......] - ETA: 49s - loss: 0.1457 - regression_loss: 0.1304 - classification_loss: 0.0153 404/500 [=======================>......] - ETA: 49s - loss: 0.1456 - regression_loss: 0.1302 - classification_loss: 0.0153 405/500 [=======================>......] - ETA: 48s - loss: 0.1454 - regression_loss: 0.1301 - classification_loss: 0.0153 406/500 [=======================>......] - ETA: 48s - loss: 0.1452 - regression_loss: 0.1299 - classification_loss: 0.0153 407/500 [=======================>......] - ETA: 47s - loss: 0.1450 - regression_loss: 0.1297 - classification_loss: 0.0153 408/500 [=======================>......] 
- ETA: 47s - loss: 0.1454 - regression_loss: 0.1301 - classification_loss: 0.0153 409/500 [=======================>......] - ETA: 46s - loss: 0.1452 - regression_loss: 0.1299 - classification_loss: 0.0153 410/500 [=======================>......] - ETA: 46s - loss: 0.1455 - regression_loss: 0.1302 - classification_loss: 0.0153 411/500 [=======================>......] - ETA: 45s - loss: 0.1453 - regression_loss: 0.1300 - classification_loss: 0.0153 412/500 [=======================>......] - ETA: 44s - loss: 0.1457 - regression_loss: 0.1304 - classification_loss: 0.0153 413/500 [=======================>......] - ETA: 44s - loss: 0.1457 - regression_loss: 0.1304 - classification_loss: 0.0153 414/500 [=======================>......] - ETA: 43s - loss: 0.1454 - regression_loss: 0.1302 - classification_loss: 0.0153 415/500 [=======================>......] - ETA: 43s - loss: 0.1452 - regression_loss: 0.1300 - classification_loss: 0.0152 416/500 [=======================>......] - ETA: 42s - loss: 0.1452 - regression_loss: 0.1300 - classification_loss: 0.0152 417/500 [========================>.....] - ETA: 42s - loss: 0.1452 - regression_loss: 0.1300 - classification_loss: 0.0152 418/500 [========================>.....] - ETA: 41s - loss: 0.1450 - regression_loss: 0.1298 - classification_loss: 0.0152 419/500 [========================>.....] - ETA: 41s - loss: 0.1451 - regression_loss: 0.1298 - classification_loss: 0.0152 420/500 [========================>.....] - ETA: 40s - loss: 0.1452 - regression_loss: 0.1299 - classification_loss: 0.0152 421/500 [========================>.....] - ETA: 40s - loss: 0.1454 - regression_loss: 0.1301 - classification_loss: 0.0152 422/500 [========================>.....] - ETA: 39s - loss: 0.1455 - regression_loss: 0.1303 - classification_loss: 0.0152 423/500 [========================>.....] - ETA: 39s - loss: 0.1453 - regression_loss: 0.1301 - classification_loss: 0.0152 424/500 [========================>.....] 
- ETA: 38s - loss: 0.1453 - regression_loss: 0.1301 - classification_loss: 0.0152 425/500 [========================>.....] - ETA: 38s - loss: 0.1453 - regression_loss: 0.1301 - classification_loss: 0.0152 426/500 [========================>.....] - ETA: 37s - loss: 0.1451 - regression_loss: 0.1299 - classification_loss: 0.0152 427/500 [========================>.....] - ETA: 37s - loss: 0.1449 - regression_loss: 0.1297 - classification_loss: 0.0151 428/500 [========================>.....] - ETA: 36s - loss: 0.1449 - regression_loss: 0.1297 - classification_loss: 0.0151 429/500 [========================>.....] - ETA: 36s - loss: 0.1449 - regression_loss: 0.1297 - classification_loss: 0.0151 430/500 [========================>.....] - ETA: 35s - loss: 0.1447 - regression_loss: 0.1296 - classification_loss: 0.0151 431/500 [========================>.....] - ETA: 35s - loss: 0.1446 - regression_loss: 0.1295 - classification_loss: 0.0151 432/500 [========================>.....] - ETA: 34s - loss: 0.1446 - regression_loss: 0.1295 - classification_loss: 0.0151 433/500 [========================>.....] - ETA: 34s - loss: 0.1444 - regression_loss: 0.1293 - classification_loss: 0.0151 434/500 [=========================>....] - ETA: 33s - loss: 0.1442 - regression_loss: 0.1291 - classification_loss: 0.0151 435/500 [=========================>....] - ETA: 33s - loss: 0.1439 - regression_loss: 0.1289 - classification_loss: 0.0150 436/500 [=========================>....] - ETA: 32s - loss: 0.1440 - regression_loss: 0.1289 - classification_loss: 0.0150 437/500 [=========================>....] - ETA: 32s - loss: 0.1438 - regression_loss: 0.1287 - classification_loss: 0.0150 438/500 [=========================>....] - ETA: 31s - loss: 0.1436 - regression_loss: 0.1286 - classification_loss: 0.0150 439/500 [=========================>....] - ETA: 31s - loss: 0.1436 - regression_loss: 0.1286 - classification_loss: 0.0150 440/500 [=========================>....] 
- ETA: 30s - loss: 0.1435 - regression_loss: 0.1285 - classification_loss: 0.0150 441/500 [=========================>....] - ETA: 30s - loss: 0.1434 - regression_loss: 0.1284 - classification_loss: 0.0150 442/500 [=========================>....] - ETA: 29s - loss: 0.1432 - regression_loss: 0.1283 - classification_loss: 0.0150 443/500 [=========================>....] - ETA: 29s - loss: 0.1431 - regression_loss: 0.1281 - classification_loss: 0.0150 444/500 [=========================>....] - ETA: 28s - loss: 0.1437 - regression_loss: 0.1288 - classification_loss: 0.0150 445/500 [=========================>....] - ETA: 28s - loss: 0.1437 - regression_loss: 0.1287 - classification_loss: 0.0150 446/500 [=========================>....] - ETA: 27s - loss: 0.1435 - regression_loss: 0.1286 - classification_loss: 0.0149 447/500 [=========================>....] - ETA: 27s - loss: 0.1434 - regression_loss: 0.1284 - classification_loss: 0.0149 448/500 [=========================>....] - ETA: 26s - loss: 0.1433 - regression_loss: 0.1283 - classification_loss: 0.0149 449/500 [=========================>....] - ETA: 26s - loss: 0.1430 - regression_loss: 0.1281 - classification_loss: 0.0149 450/500 [==========================>...] - ETA: 25s - loss: 0.1428 - regression_loss: 0.1279 - classification_loss: 0.0149 451/500 [==========================>...] - ETA: 25s - loss: 0.1427 - regression_loss: 0.1278 - classification_loss: 0.0149 452/500 [==========================>...] - ETA: 24s - loss: 0.1427 - regression_loss: 0.1278 - classification_loss: 0.0149 453/500 [==========================>...] - ETA: 24s - loss: 0.1427 - regression_loss: 0.1278 - classification_loss: 0.0149 454/500 [==========================>...] - ETA: 23s - loss: 0.1428 - regression_loss: 0.1279 - classification_loss: 0.0149 455/500 [==========================>...] - ETA: 22s - loss: 0.1429 - regression_loss: 0.1280 - classification_loss: 0.0149 456/500 [==========================>...] 
- ETA: 22s - loss: 0.1428 - regression_loss: 0.1279 - classification_loss: 0.0149 457/500 [==========================>...] - ETA: 21s - loss: 0.1426 - regression_loss: 0.1278 - classification_loss: 0.0148 458/500 [==========================>...] - ETA: 21s - loss: 0.1425 - regression_loss: 0.1277 - classification_loss: 0.0148 459/500 [==========================>...] - ETA: 20s - loss: 0.1424 - regression_loss: 0.1275 - classification_loss: 0.0148 460/500 [==========================>...] - ETA: 20s - loss: 0.1421 - regression_loss: 0.1273 - classification_loss: 0.0148 461/500 [==========================>...] - ETA: 19s - loss: 0.1422 - regression_loss: 0.1274 - classification_loss: 0.0148 462/500 [==========================>...] - ETA: 19s - loss: 0.1421 - regression_loss: 0.1273 - classification_loss: 0.0148 463/500 [==========================>...] - ETA: 18s - loss: 0.1420 - regression_loss: 0.1272 - classification_loss: 0.0148 464/500 [==========================>...] - ETA: 18s - loss: 0.1418 - regression_loss: 0.1270 - classification_loss: 0.0148 465/500 [==========================>...] - ETA: 17s - loss: 0.1417 - regression_loss: 0.1269 - classification_loss: 0.0148 466/500 [==========================>...] - ETA: 17s - loss: 0.1420 - regression_loss: 0.1271 - classification_loss: 0.0148 467/500 [===========================>..] - ETA: 16s - loss: 0.1420 - regression_loss: 0.1271 - classification_loss: 0.0148 468/500 [===========================>..] - ETA: 16s - loss: 0.1423 - regression_loss: 0.1274 - classification_loss: 0.0149 469/500 [===========================>..] - ETA: 15s - loss: 0.1423 - regression_loss: 0.1274 - classification_loss: 0.0149 470/500 [===========================>..] - ETA: 15s - loss: 0.1421 - regression_loss: 0.1273 - classification_loss: 0.0148 471/500 [===========================>..] - ETA: 14s - loss: 0.1419 - regression_loss: 0.1271 - classification_loss: 0.0148 472/500 [===========================>..] 
- ETA: 14s - loss: 0.1419 - regression_loss: 0.1271 - classification_loss: 0.0148 473/500 [===========================>..] - ETA: 13s - loss: 0.1418 - regression_loss: 0.1269 - classification_loss: 0.0148 474/500 [===========================>..] - ETA: 13s - loss: 0.1418 - regression_loss: 0.1270 - classification_loss: 0.0148 475/500 [===========================>..] - ETA: 12s - loss: 0.1422 - regression_loss: 0.1273 - classification_loss: 0.0148 476/500 [===========================>..] - ETA: 12s - loss: 0.1420 - regression_loss: 0.1272 - classification_loss: 0.0148 477/500 [===========================>..] - ETA: 11s - loss: 0.1418 - regression_loss: 0.1270 - classification_loss: 0.0148 478/500 [===========================>..] - ETA: 11s - loss: 0.1418 - regression_loss: 0.1270 - classification_loss: 0.0148 479/500 [===========================>..] - ETA: 10s - loss: 0.1416 - regression_loss: 0.1268 - classification_loss: 0.0148 480/500 [===========================>..] - ETA: 10s - loss: 0.1414 - regression_loss: 0.1267 - classification_loss: 0.0148 481/500 [===========================>..] - ETA: 9s - loss: 0.1413 - regression_loss: 0.1266 - classification_loss: 0.0147  482/500 [===========================>..] - ETA: 9s - loss: 0.1413 - regression_loss: 0.1266 - classification_loss: 0.0147 483/500 [===========================>..] - ETA: 8s - loss: 0.1412 - regression_loss: 0.1265 - classification_loss: 0.0147 484/500 [============================>.] - ETA: 8s - loss: 0.1412 - regression_loss: 0.1265 - classification_loss: 0.0147 485/500 [============================>.] - ETA: 7s - loss: 0.1412 - regression_loss: 0.1265 - classification_loss: 0.0147 486/500 [============================>.] - ETA: 7s - loss: 0.1410 - regression_loss: 0.1263 - classification_loss: 0.0147 487/500 [============================>.] - ETA: 6s - loss: 0.1409 - regression_loss: 0.1263 - classification_loss: 0.0147 488/500 [============================>.] 
- ETA: 6s - loss: 0.1408 - regression_loss: 0.1261 - classification_loss: 0.0147 489/500 [============================>.] - ETA: 5s - loss: 0.1407 - regression_loss: 0.1261 - classification_loss: 0.0146 490/500 [============================>.] - ETA: 5s - loss: 0.1409 - regression_loss: 0.1262 - classification_loss: 0.0147 491/500 [============================>.] - ETA: 4s - loss: 0.1407 - regression_loss: 0.1261 - classification_loss: 0.0146 492/500 [============================>.] - ETA: 4s - loss: 0.1408 - regression_loss: 0.1261 - classification_loss: 0.0146 493/500 [============================>.] - ETA: 3s - loss: 0.1407 - regression_loss: 0.1261 - classification_loss: 0.0146 494/500 [============================>.] - ETA: 3s - loss: 0.1406 - regression_loss: 0.1259 - classification_loss: 0.0146 495/500 [============================>.] - ETA: 2s - loss: 0.1409 - regression_loss: 0.1262 - classification_loss: 0.0147 496/500 [============================>.] - ETA: 2s - loss: 0.1407 - regression_loss: 0.1260 - classification_loss: 0.0147 497/500 [============================>.] - ETA: 1s - loss: 0.1405 - regression_loss: 0.1259 - classification_loss: 0.0147 498/500 [============================>.] - ETA: 1s - loss: 0.1405 - regression_loss: 0.1259 - classification_loss: 0.0147 499/500 [============================>.] - ETA: 0s - loss: 0.1407 - regression_loss: 0.1260 - classification_loss: 0.0147 500/500 [==============================] - 256s 511ms/step - loss: 0.1405 - regression_loss: 0.1258 - classification_loss: 0.0146 326 instances of class plum with average precision: 0.8330 mAP: 0.8330 Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5 Epoch 9/150 1/500 [..............................] - ETA: 4:17 - loss: 0.2136 - regression_loss: 0.1933 - classification_loss: 0.0202 2/500 [..............................] - ETA: 4:11 - loss: 0.1350 - regression_loss: 0.1211 - classification_loss: 0.0139 3/500 [..............................] 
- ETA: 4:11 - loss: 0.1120 - regression_loss: 0.1010 - classification_loss: 0.0110
[... per-batch progress-bar updates for steps 4-82 of epoch 9 elided; loss settled around ~0.15 over this span ...]
83/500 [===>..........................] 
- ETA: 3:32 - loss: 0.1537 - regression_loss: 0.1348 - classification_loss: 0.0188 84/500 [====>.........................] - ETA: 3:31 - loss: 0.1540 - regression_loss: 0.1352 - classification_loss: 0.0188 85/500 [====>.........................] - ETA: 3:31 - loss: 0.1540 - regression_loss: 0.1352 - classification_loss: 0.0188 86/500 [====>.........................] - ETA: 3:30 - loss: 0.1535 - regression_loss: 0.1347 - classification_loss: 0.0187 87/500 [====>.........................] - ETA: 3:30 - loss: 0.1569 - regression_loss: 0.1381 - classification_loss: 0.0188 88/500 [====>.........................] - ETA: 3:29 - loss: 0.1560 - regression_loss: 0.1373 - classification_loss: 0.0187 89/500 [====>.........................] - ETA: 3:29 - loss: 0.1564 - regression_loss: 0.1377 - classification_loss: 0.0187 90/500 [====>.........................] - ETA: 3:28 - loss: 0.1560 - regression_loss: 0.1374 - classification_loss: 0.0187 91/500 [====>.........................] - ETA: 3:28 - loss: 0.1551 - regression_loss: 0.1365 - classification_loss: 0.0185 92/500 [====>.........................] - ETA: 3:27 - loss: 0.1542 - regression_loss: 0.1358 - classification_loss: 0.0184 93/500 [====>.........................] - ETA: 3:27 - loss: 0.1544 - regression_loss: 0.1360 - classification_loss: 0.0184 94/500 [====>.........................] - ETA: 3:26 - loss: 0.1624 - regression_loss: 0.1415 - classification_loss: 0.0209 95/500 [====>.........................] - ETA: 3:26 - loss: 0.1612 - regression_loss: 0.1404 - classification_loss: 0.0208 96/500 [====>.........................] - ETA: 3:25 - loss: 0.1601 - regression_loss: 0.1395 - classification_loss: 0.0207 97/500 [====>.........................] - ETA: 3:25 - loss: 0.1597 - regression_loss: 0.1392 - classification_loss: 0.0205 98/500 [====>.........................] - ETA: 3:24 - loss: 0.1590 - regression_loss: 0.1385 - classification_loss: 0.0205 99/500 [====>.........................] 
- ETA: 3:24 - loss: 0.1580 - regression_loss: 0.1376 - classification_loss: 0.0204 100/500 [=====>........................] - ETA: 3:23 - loss: 0.1569 - regression_loss: 0.1367 - classification_loss: 0.0202 101/500 [=====>........................] - ETA: 3:23 - loss: 0.1560 - regression_loss: 0.1359 - classification_loss: 0.0201 102/500 [=====>........................] - ETA: 3:22 - loss: 0.1547 - regression_loss: 0.1348 - classification_loss: 0.0200 103/500 [=====>........................] - ETA: 3:22 - loss: 0.1540 - regression_loss: 0.1342 - classification_loss: 0.0198 104/500 [=====>........................] - ETA: 3:21 - loss: 0.1544 - regression_loss: 0.1346 - classification_loss: 0.0198 105/500 [=====>........................] - ETA: 3:21 - loss: 0.1548 - regression_loss: 0.1351 - classification_loss: 0.0197 106/500 [=====>........................] - ETA: 3:20 - loss: 0.1546 - regression_loss: 0.1349 - classification_loss: 0.0197 107/500 [=====>........................] - ETA: 3:20 - loss: 0.1538 - regression_loss: 0.1342 - classification_loss: 0.0196 108/500 [=====>........................] - ETA: 3:19 - loss: 0.1536 - regression_loss: 0.1342 - classification_loss: 0.0195 109/500 [=====>........................] - ETA: 3:19 - loss: 0.1536 - regression_loss: 0.1343 - classification_loss: 0.0194 110/500 [=====>........................] - ETA: 3:18 - loss: 0.1533 - regression_loss: 0.1340 - classification_loss: 0.0193 111/500 [=====>........................] - ETA: 3:18 - loss: 0.1532 - regression_loss: 0.1340 - classification_loss: 0.0192 112/500 [=====>........................] - ETA: 3:17 - loss: 0.1524 - regression_loss: 0.1333 - classification_loss: 0.0191 113/500 [=====>........................] - ETA: 3:17 - loss: 0.1520 - regression_loss: 0.1330 - classification_loss: 0.0190 114/500 [=====>........................] - ETA: 3:16 - loss: 0.1510 - regression_loss: 0.1321 - classification_loss: 0.0189 115/500 [=====>........................] 
- ETA: 3:16 - loss: 0.1510 - regression_loss: 0.1322 - classification_loss: 0.0188 116/500 [=====>........................] - ETA: 3:15 - loss: 0.1501 - regression_loss: 0.1314 - classification_loss: 0.0187 117/500 [======>.......................] - ETA: 3:15 - loss: 0.1492 - regression_loss: 0.1306 - classification_loss: 0.0186 118/500 [======>.......................] - ETA: 3:14 - loss: 0.1494 - regression_loss: 0.1308 - classification_loss: 0.0186 119/500 [======>.......................] - ETA: 3:14 - loss: 0.1487 - regression_loss: 0.1302 - classification_loss: 0.0185 120/500 [======>.......................] - ETA: 3:13 - loss: 0.1503 - regression_loss: 0.1318 - classification_loss: 0.0185 121/500 [======>.......................] - ETA: 3:13 - loss: 0.1499 - regression_loss: 0.1314 - classification_loss: 0.0185 122/500 [======>.......................] - ETA: 3:12 - loss: 0.1498 - regression_loss: 0.1314 - classification_loss: 0.0184 123/500 [======>.......................] - ETA: 3:12 - loss: 0.1490 - regression_loss: 0.1307 - classification_loss: 0.0183 124/500 [======>.......................] - ETA: 3:11 - loss: 0.1482 - regression_loss: 0.1300 - classification_loss: 0.0182 125/500 [======>.......................] - ETA: 3:11 - loss: 0.1475 - regression_loss: 0.1294 - classification_loss: 0.0181 126/500 [======>.......................] - ETA: 3:10 - loss: 0.1467 - regression_loss: 0.1287 - classification_loss: 0.0180 127/500 [======>.......................] - ETA: 3:10 - loss: 0.1461 - regression_loss: 0.1281 - classification_loss: 0.0179 128/500 [======>.......................] - ETA: 3:09 - loss: 0.1456 - regression_loss: 0.1278 - classification_loss: 0.0179 129/500 [======>.......................] - ETA: 3:09 - loss: 0.1452 - regression_loss: 0.1274 - classification_loss: 0.0178 130/500 [======>.......................] - ETA: 3:08 - loss: 0.1466 - regression_loss: 0.1285 - classification_loss: 0.0180 131/500 [======>.......................] 
- ETA: 3:08 - loss: 0.1458 - regression_loss: 0.1279 - classification_loss: 0.0179 132/500 [======>.......................] - ETA: 3:07 - loss: 0.1452 - regression_loss: 0.1274 - classification_loss: 0.0178 133/500 [======>.......................] - ETA: 3:07 - loss: 0.1460 - regression_loss: 0.1280 - classification_loss: 0.0180 134/500 [=======>......................] - ETA: 3:06 - loss: 0.1458 - regression_loss: 0.1279 - classification_loss: 0.0179 135/500 [=======>......................] - ETA: 3:06 - loss: 0.1452 - regression_loss: 0.1274 - classification_loss: 0.0179 136/500 [=======>......................] - ETA: 3:05 - loss: 0.1446 - regression_loss: 0.1268 - classification_loss: 0.0178 137/500 [=======>......................] - ETA: 3:05 - loss: 0.1447 - regression_loss: 0.1269 - classification_loss: 0.0177 138/500 [=======>......................] - ETA: 3:04 - loss: 0.1448 - regression_loss: 0.1271 - classification_loss: 0.0177 139/500 [=======>......................] - ETA: 3:04 - loss: 0.1443 - regression_loss: 0.1266 - classification_loss: 0.0177 140/500 [=======>......................] - ETA: 3:03 - loss: 0.1443 - regression_loss: 0.1267 - classification_loss: 0.0176 141/500 [=======>......................] - ETA: 3:03 - loss: 0.1439 - regression_loss: 0.1263 - classification_loss: 0.0176 142/500 [=======>......................] - ETA: 3:02 - loss: 0.1458 - regression_loss: 0.1278 - classification_loss: 0.0181 143/500 [=======>......................] - ETA: 3:02 - loss: 0.1454 - regression_loss: 0.1274 - classification_loss: 0.0180 144/500 [=======>......................] - ETA: 3:01 - loss: 0.1450 - regression_loss: 0.1271 - classification_loss: 0.0179 145/500 [=======>......................] - ETA: 3:01 - loss: 0.1444 - regression_loss: 0.1266 - classification_loss: 0.0178 146/500 [=======>......................] - ETA: 3:00 - loss: 0.1474 - regression_loss: 0.1293 - classification_loss: 0.0181 147/500 [=======>......................] 
- ETA: 3:00 - loss: 0.1473 - regression_loss: 0.1292 - classification_loss: 0.0180 148/500 [=======>......................] - ETA: 3:00 - loss: 0.1471 - regression_loss: 0.1291 - classification_loss: 0.0180 149/500 [=======>......................] - ETA: 2:59 - loss: 0.1469 - regression_loss: 0.1289 - classification_loss: 0.0180 150/500 [========>.....................] - ETA: 2:58 - loss: 0.1463 - regression_loss: 0.1284 - classification_loss: 0.0179 151/500 [========>.....................] - ETA: 2:58 - loss: 0.1470 - regression_loss: 0.1291 - classification_loss: 0.0179 152/500 [========>.....................] - ETA: 2:57 - loss: 0.1471 - regression_loss: 0.1291 - classification_loss: 0.0179 153/500 [========>.....................] - ETA: 2:57 - loss: 0.1465 - regression_loss: 0.1286 - classification_loss: 0.0179 154/500 [========>.....................] - ETA: 2:56 - loss: 0.1460 - regression_loss: 0.1282 - classification_loss: 0.0178 155/500 [========>.....................] - ETA: 2:56 - loss: 0.1456 - regression_loss: 0.1278 - classification_loss: 0.0178 156/500 [========>.....................] - ETA: 2:55 - loss: 0.1448 - regression_loss: 0.1271 - classification_loss: 0.0177 157/500 [========>.....................] - ETA: 2:55 - loss: 0.1449 - regression_loss: 0.1273 - classification_loss: 0.0177 158/500 [========>.....................] - ETA: 2:54 - loss: 0.1445 - regression_loss: 0.1269 - classification_loss: 0.0176 159/500 [========>.....................] - ETA: 2:54 - loss: 0.1442 - regression_loss: 0.1267 - classification_loss: 0.0176 160/500 [========>.....................] - ETA: 2:53 - loss: 0.1436 - regression_loss: 0.1262 - classification_loss: 0.0175 161/500 [========>.....................] - ETA: 2:53 - loss: 0.1430 - regression_loss: 0.1256 - classification_loss: 0.0174 162/500 [========>.....................] - ETA: 2:52 - loss: 0.1437 - regression_loss: 0.1262 - classification_loss: 0.0175 163/500 [========>.....................] 
- ETA: 2:52 - loss: 0.1436 - regression_loss: 0.1262 - classification_loss: 0.0175 164/500 [========>.....................] - ETA: 2:51 - loss: 0.1430 - regression_loss: 0.1256 - classification_loss: 0.0174 165/500 [========>.....................] - ETA: 2:51 - loss: 0.1427 - regression_loss: 0.1254 - classification_loss: 0.0173 166/500 [========>.....................] - ETA: 2:50 - loss: 0.1428 - regression_loss: 0.1254 - classification_loss: 0.0174 167/500 [=========>....................] - ETA: 2:50 - loss: 0.1424 - regression_loss: 0.1251 - classification_loss: 0.0173 168/500 [=========>....................] - ETA: 2:49 - loss: 0.1424 - regression_loss: 0.1251 - classification_loss: 0.0173 169/500 [=========>....................] - ETA: 2:49 - loss: 0.1423 - regression_loss: 0.1251 - classification_loss: 0.0172 170/500 [=========>....................] - ETA: 2:48 - loss: 0.1426 - regression_loss: 0.1253 - classification_loss: 0.0173 171/500 [=========>....................] - ETA: 2:48 - loss: 0.1424 - regression_loss: 0.1251 - classification_loss: 0.0173 172/500 [=========>....................] - ETA: 2:47 - loss: 0.1421 - regression_loss: 0.1248 - classification_loss: 0.0173 173/500 [=========>....................] - ETA: 2:47 - loss: 0.1422 - regression_loss: 0.1250 - classification_loss: 0.0173 174/500 [=========>....................] - ETA: 2:46 - loss: 0.1417 - regression_loss: 0.1245 - classification_loss: 0.0172 175/500 [=========>....................] - ETA: 2:46 - loss: 0.1418 - regression_loss: 0.1246 - classification_loss: 0.0172 176/500 [=========>....................] - ETA: 2:45 - loss: 0.1415 - regression_loss: 0.1244 - classification_loss: 0.0171 177/500 [=========>....................] - ETA: 2:45 - loss: 0.1425 - regression_loss: 0.1253 - classification_loss: 0.0172 178/500 [=========>....................] - ETA: 2:44 - loss: 0.1420 - regression_loss: 0.1249 - classification_loss: 0.0171 179/500 [=========>....................] 
- ETA: 2:44 - loss: 0.1422 - regression_loss: 0.1251 - classification_loss: 0.0171 180/500 [=========>....................] - ETA: 2:43 - loss: 0.1419 - regression_loss: 0.1248 - classification_loss: 0.0170 181/500 [=========>....................] - ETA: 2:43 - loss: 0.1422 - regression_loss: 0.1252 - classification_loss: 0.0170 182/500 [=========>....................] - ETA: 2:42 - loss: 0.1419 - regression_loss: 0.1249 - classification_loss: 0.0170 183/500 [=========>....................] - ETA: 2:42 - loss: 0.1416 - regression_loss: 0.1247 - classification_loss: 0.0169 184/500 [==========>...................] - ETA: 2:41 - loss: 0.1413 - regression_loss: 0.1244 - classification_loss: 0.0169 185/500 [==========>...................] - ETA: 2:41 - loss: 0.1411 - regression_loss: 0.1243 - classification_loss: 0.0168 186/500 [==========>...................] - ETA: 2:40 - loss: 0.1406 - regression_loss: 0.1238 - classification_loss: 0.0167 187/500 [==========>...................] - ETA: 2:40 - loss: 0.1402 - regression_loss: 0.1235 - classification_loss: 0.0167 188/500 [==========>...................] - ETA: 2:39 - loss: 0.1398 - regression_loss: 0.1231 - classification_loss: 0.0167 189/500 [==========>...................] - ETA: 2:39 - loss: 0.1396 - regression_loss: 0.1230 - classification_loss: 0.0166 190/500 [==========>...................] - ETA: 2:38 - loss: 0.1391 - regression_loss: 0.1225 - classification_loss: 0.0166 191/500 [==========>...................] - ETA: 2:38 - loss: 0.1388 - regression_loss: 0.1223 - classification_loss: 0.0165 192/500 [==========>...................] - ETA: 2:37 - loss: 0.1386 - regression_loss: 0.1221 - classification_loss: 0.0165 193/500 [==========>...................] - ETA: 2:37 - loss: 0.1382 - regression_loss: 0.1217 - classification_loss: 0.0165 194/500 [==========>...................] - ETA: 2:36 - loss: 0.1382 - regression_loss: 0.1217 - classification_loss: 0.0165 195/500 [==========>...................] 
- ETA: 2:36 - loss: 0.1379 - regression_loss: 0.1215 - classification_loss: 0.0164 196/500 [==========>...................] - ETA: 2:35 - loss: 0.1383 - regression_loss: 0.1219 - classification_loss: 0.0164 197/500 [==========>...................] - ETA: 2:34 - loss: 0.1380 - regression_loss: 0.1216 - classification_loss: 0.0164 198/500 [==========>...................] - ETA: 2:34 - loss: 0.1374 - regression_loss: 0.1211 - classification_loss: 0.0163 199/500 [==========>...................] - ETA: 2:33 - loss: 0.1370 - regression_loss: 0.1208 - classification_loss: 0.0163 200/500 [===========>..................] - ETA: 2:33 - loss: 0.1370 - regression_loss: 0.1208 - classification_loss: 0.0163 201/500 [===========>..................] - ETA: 2:32 - loss: 0.1368 - regression_loss: 0.1206 - classification_loss: 0.0162 202/500 [===========>..................] - ETA: 2:32 - loss: 0.1373 - regression_loss: 0.1211 - classification_loss: 0.0162 203/500 [===========>..................] - ETA: 2:31 - loss: 0.1370 - regression_loss: 0.1208 - classification_loss: 0.0162 204/500 [===========>..................] - ETA: 2:31 - loss: 0.1374 - regression_loss: 0.1213 - classification_loss: 0.0162 205/500 [===========>..................] - ETA: 2:30 - loss: 0.1373 - regression_loss: 0.1212 - classification_loss: 0.0161 206/500 [===========>..................] - ETA: 2:30 - loss: 0.1391 - regression_loss: 0.1225 - classification_loss: 0.0166 207/500 [===========>..................] - ETA: 2:29 - loss: 0.1411 - regression_loss: 0.1244 - classification_loss: 0.0167 208/500 [===========>..................] - ETA: 2:29 - loss: 0.1407 - regression_loss: 0.1241 - classification_loss: 0.0166 209/500 [===========>..................] - ETA: 2:28 - loss: 0.1404 - regression_loss: 0.1238 - classification_loss: 0.0166 210/500 [===========>..................] - ETA: 2:28 - loss: 0.1402 - regression_loss: 0.1237 - classification_loss: 0.0166 211/500 [===========>..................] 
- ETA: 2:27 - loss: 0.1398 - regression_loss: 0.1233 - classification_loss: 0.0165 212/500 [===========>..................] - ETA: 2:27 - loss: 0.1395 - regression_loss: 0.1230 - classification_loss: 0.0165 213/500 [===========>..................] - ETA: 2:26 - loss: 0.1403 - regression_loss: 0.1237 - classification_loss: 0.0167 214/500 [===========>..................] - ETA: 2:26 - loss: 0.1400 - regression_loss: 0.1234 - classification_loss: 0.0166 215/500 [===========>..................] - ETA: 2:25 - loss: 0.1396 - regression_loss: 0.1230 - classification_loss: 0.0166 216/500 [===========>..................] - ETA: 2:25 - loss: 0.1392 - regression_loss: 0.1227 - classification_loss: 0.0165 217/500 [============>.................] - ETA: 2:24 - loss: 0.1391 - regression_loss: 0.1226 - classification_loss: 0.0165 218/500 [============>.................] - ETA: 2:23 - loss: 0.1397 - regression_loss: 0.1232 - classification_loss: 0.0165 219/500 [============>.................] - ETA: 2:23 - loss: 0.1393 - regression_loss: 0.1228 - classification_loss: 0.0165 220/500 [============>.................] - ETA: 2:22 - loss: 0.1389 - regression_loss: 0.1225 - classification_loss: 0.0164 221/500 [============>.................] - ETA: 2:22 - loss: 0.1386 - regression_loss: 0.1222 - classification_loss: 0.0164 222/500 [============>.................] - ETA: 2:21 - loss: 0.1384 - regression_loss: 0.1220 - classification_loss: 0.0163 223/500 [============>.................] - ETA: 2:21 - loss: 0.1382 - regression_loss: 0.1219 - classification_loss: 0.0163 224/500 [============>.................] - ETA: 2:20 - loss: 0.1380 - regression_loss: 0.1217 - classification_loss: 0.0163 225/500 [============>.................] - ETA: 2:20 - loss: 0.1385 - regression_loss: 0.1222 - classification_loss: 0.0163 226/500 [============>.................] - ETA: 2:19 - loss: 0.1381 - regression_loss: 0.1219 - classification_loss: 0.0163 227/500 [============>.................] 
- ETA: 2:19 - loss: 0.1381 - regression_loss: 0.1218 - classification_loss: 0.0163 228/500 [============>.................] - ETA: 2:18 - loss: 0.1398 - regression_loss: 0.1234 - classification_loss: 0.0164 229/500 [============>.................] - ETA: 2:18 - loss: 0.1397 - regression_loss: 0.1233 - classification_loss: 0.0163 230/500 [============>.................] - ETA: 2:17 - loss: 0.1392 - regression_loss: 0.1229 - classification_loss: 0.0163 231/500 [============>.................] - ETA: 2:17 - loss: 0.1392 - regression_loss: 0.1229 - classification_loss: 0.0163 232/500 [============>.................] - ETA: 2:16 - loss: 0.1391 - regression_loss: 0.1228 - classification_loss: 0.0163 233/500 [============>.................] - ETA: 2:16 - loss: 0.1389 - regression_loss: 0.1226 - classification_loss: 0.0163 234/500 [=============>................] - ETA: 2:15 - loss: 0.1389 - regression_loss: 0.1227 - classification_loss: 0.0163 235/500 [=============>................] - ETA: 2:15 - loss: 0.1386 - regression_loss: 0.1224 - classification_loss: 0.0162 236/500 [=============>................] - ETA: 2:14 - loss: 0.1400 - regression_loss: 0.1234 - classification_loss: 0.0165 237/500 [=============>................] - ETA: 2:14 - loss: 0.1404 - regression_loss: 0.1239 - classification_loss: 0.0165 238/500 [=============>................] - ETA: 2:13 - loss: 0.1417 - regression_loss: 0.1252 - classification_loss: 0.0165 239/500 [=============>................] - ETA: 2:13 - loss: 0.1415 - regression_loss: 0.1250 - classification_loss: 0.0165 240/500 [=============>................] - ETA: 2:12 - loss: 0.1413 - regression_loss: 0.1249 - classification_loss: 0.0165 241/500 [=============>................] - ETA: 2:12 - loss: 0.1409 - regression_loss: 0.1245 - classification_loss: 0.0164 242/500 [=============>................] - ETA: 2:11 - loss: 0.1405 - regression_loss: 0.1241 - classification_loss: 0.0164 243/500 [=============>................] 
- ETA: 2:11 - loss: 0.1403 - regression_loss: 0.1239 - classification_loss: 0.0163 244/500 [=============>................] - ETA: 2:10 - loss: 0.1400 - regression_loss: 0.1237 - classification_loss: 0.0163 245/500 [=============>................] - ETA: 2:10 - loss: 0.1399 - regression_loss: 0.1236 - classification_loss: 0.0163 246/500 [=============>................] - ETA: 2:09 - loss: 0.1397 - regression_loss: 0.1234 - classification_loss: 0.0162 247/500 [=============>................] - ETA: 2:09 - loss: 0.1393 - regression_loss: 0.1231 - classification_loss: 0.0162 248/500 [=============>................] - ETA: 2:08 - loss: 0.1394 - regression_loss: 0.1232 - classification_loss: 0.0162 249/500 [=============>................] - ETA: 2:08 - loss: 0.1392 - regression_loss: 0.1230 - classification_loss: 0.0162 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1393 - regression_loss: 0.1231 - classification_loss: 0.0162 251/500 [==============>...............] - ETA: 2:07 - loss: 0.1390 - regression_loss: 0.1229 - classification_loss: 0.0161 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1395 - regression_loss: 0.1233 - classification_loss: 0.0162 253/500 [==============>...............] - ETA: 2:06 - loss: 0.1392 - regression_loss: 0.1231 - classification_loss: 0.0162 254/500 [==============>...............] - ETA: 2:05 - loss: 0.1391 - regression_loss: 0.1229 - classification_loss: 0.0162 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1389 - regression_loss: 0.1228 - classification_loss: 0.0162 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1397 - regression_loss: 0.1236 - classification_loss: 0.0161 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1393 - regression_loss: 0.1233 - classification_loss: 0.0161 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1390 - regression_loss: 0.1230 - classification_loss: 0.0160 259/500 [==============>...............] 
- ETA: 2:03 - loss: 0.1388 - regression_loss: 0.1228 - classification_loss: 0.0160 260/500 [==============>...............] - ETA: 2:02 - loss: 0.1386 - regression_loss: 0.1226 - classification_loss: 0.0160 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1386 - regression_loss: 0.1226 - classification_loss: 0.0160 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1384 - regression_loss: 0.1225 - classification_loss: 0.0160 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1380 - regression_loss: 0.1221 - classification_loss: 0.0159 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1378 - regression_loss: 0.1219 - classification_loss: 0.0159 265/500 [==============>...............] - ETA: 2:00 - loss: 0.1376 - regression_loss: 0.1218 - classification_loss: 0.0159 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1377 - regression_loss: 0.1218 - classification_loss: 0.0159 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1377 - regression_loss: 0.1218 - classification_loss: 0.0159 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1374 - regression_loss: 0.1215 - classification_loss: 0.0158 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1371 - regression_loss: 0.1213 - classification_loss: 0.0158 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1368 - regression_loss: 0.1210 - classification_loss: 0.0158 271/500 [===============>..............] - ETA: 1:57 - loss: 0.1374 - regression_loss: 0.1215 - classification_loss: 0.0158 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1374 - regression_loss: 0.1215 - classification_loss: 0.0158 273/500 [===============>..............] - ETA: 1:56 - loss: 0.1370 - regression_loss: 0.1212 - classification_loss: 0.0158 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1369 - regression_loss: 0.1211 - classification_loss: 0.0158 275/500 [===============>..............] 
- ETA: 1:55 - loss: 0.1368 - regression_loss: 0.1210 - classification_loss: 0.0158 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1368 - regression_loss: 0.1210 - classification_loss: 0.0158 277/500 [===============>..............] - ETA: 1:54 - loss: 0.1365 - regression_loss: 0.1208 - classification_loss: 0.0158 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1370 - regression_loss: 0.1211 - classification_loss: 0.0159 279/500 [===============>..............] - ETA: 1:53 - loss: 0.1367 - regression_loss: 0.1209 - classification_loss: 0.0159 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1370 - regression_loss: 0.1211 - classification_loss: 0.0159 281/500 [===============>..............] - ETA: 1:52 - loss: 0.1377 - regression_loss: 0.1218 - classification_loss: 0.0159 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1376 - regression_loss: 0.1217 - classification_loss: 0.0159 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1373 - regression_loss: 0.1215 - classification_loss: 0.0158 284/500 [================>.............] - ETA: 1:50 - loss: 0.1371 - regression_loss: 0.1213 - classification_loss: 0.0158 285/500 [================>.............] - ETA: 1:49 - loss: 0.1369 - regression_loss: 0.1211 - classification_loss: 0.0158 286/500 [================>.............] - ETA: 1:49 - loss: 0.1367 - regression_loss: 0.1209 - classification_loss: 0.0158 287/500 [================>.............] - ETA: 1:48 - loss: 0.1366 - regression_loss: 0.1208 - classification_loss: 0.0158 288/500 [================>.............] - ETA: 1:48 - loss: 0.1364 - regression_loss: 0.1206 - classification_loss: 0.0158 289/500 [================>.............] - ETA: 1:47 - loss: 0.1362 - regression_loss: 0.1204 - classification_loss: 0.0157 290/500 [================>.............] - ETA: 1:47 - loss: 0.1359 - regression_loss: 0.1202 - classification_loss: 0.0157 291/500 [================>.............] 
[... per-batch progress updates through 499/500 omitted ...]
- ETA: 0s - loss: 0.1304 - regression_loss: 0.1155 - classification_loss: 0.0149
500/500 [==============================] - 256s 513ms/step - loss: 0.1305 - regression_loss: 0.1156 - classification_loss: 0.0149
326 instances of class plum with average precision: 0.8457
mAP: 0.8457
Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5
Epoch 10/150
1/500 [..............................] - ETA: 4:17 - loss: 0.0592 - regression_loss: 0.0537 - classification_loss: 0.0055
[... per-batch progress updates 2/500 - 14/500 omitted ...]
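As the epoch summary above shows, the reported `loss` is the sum of `regression_loss` and `classification_loss` (in RetinaNet these are the smooth-L1 box regression loss and the focal classification loss). If you want to plot loss curves from a captured log like this one, the lines are regular enough to parse with a regex. The sketch below is illustrative only; `parse_progress` and `LINE_RE` are names invented here, not part of keras-retinanet:

```python
import re

# Each Keras progress update in the log has the form:
# "N/500 [...] - ETA: ... - loss: X - regression_loss: Y - classification_loss: Z"
LINE_RE = re.compile(
    r"(?P<step>\d+)/\d+ .*?"
    r"loss: (?P<loss>\d+\.\d+) - "
    r"regression_loss: (?P<reg>\d+\.\d+) - "
    r"classification_loss: (?P<cls>\d+\.\d+)"
)

def parse_progress(text):
    """Extract (step, loss, regression_loss, classification_loss) tuples."""
    return [
        (int(m["step"]), float(m["loss"]), float(m["reg"]), float(m["cls"]))
        for m in LINE_RE.finditer(text)
    ]

sample = ("500/500 [==============================] - 256s 513ms/step - "
          "loss: 0.1305 - regression_loss: 0.1156 - classification_loss: 0.0149")
records = parse_progress(sample)
step, loss, reg, cls = records[0]
# Sanity check: the total loss is the sum of its two components.
assert abs(loss - (reg + cls)) < 1e-3
```

Feeding the whole log file through `parse_progress` yields one record per progress update, which can then be grouped by epoch to plot how the two loss components decay.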
[... per-batch progress updates 15/500 - 125/500 omitted ...]
- ETA: 3:10 - loss: 0.1331 - regression_loss: 0.1187 - classification_loss: 0.0144 127/500 [======>.......................] - ETA: 3:09 - loss: 0.1341 - regression_loss: 0.1196 - classification_loss: 0.0145 128/500 [======>.......................] - ETA: 3:09 - loss: 0.1344 - regression_loss: 0.1199 - classification_loss: 0.0145 129/500 [======>.......................] - ETA: 3:08 - loss: 0.1342 - regression_loss: 0.1197 - classification_loss: 0.0145 130/500 [======>.......................] - ETA: 3:08 - loss: 0.1335 - regression_loss: 0.1191 - classification_loss: 0.0144 131/500 [======>.......................] - ETA: 3:07 - loss: 0.1380 - regression_loss: 0.1231 - classification_loss: 0.0149 132/500 [======>.......................] - ETA: 3:07 - loss: 0.1373 - regression_loss: 0.1225 - classification_loss: 0.0148 133/500 [======>.......................] - ETA: 3:06 - loss: 0.1371 - regression_loss: 0.1222 - classification_loss: 0.0148 134/500 [=======>......................] - ETA: 3:06 - loss: 0.1375 - regression_loss: 0.1226 - classification_loss: 0.0149 135/500 [=======>......................] - ETA: 3:06 - loss: 0.1369 - regression_loss: 0.1221 - classification_loss: 0.0148 136/500 [=======>......................] - ETA: 3:05 - loss: 0.1365 - regression_loss: 0.1217 - classification_loss: 0.0148 137/500 [=======>......................] - ETA: 3:05 - loss: 0.1368 - regression_loss: 0.1220 - classification_loss: 0.0148 138/500 [=======>......................] - ETA: 3:04 - loss: 0.1362 - regression_loss: 0.1215 - classification_loss: 0.0147 139/500 [=======>......................] - ETA: 3:04 - loss: 0.1356 - regression_loss: 0.1210 - classification_loss: 0.0147 140/500 [=======>......................] - ETA: 3:03 - loss: 0.1377 - regression_loss: 0.1231 - classification_loss: 0.0147 141/500 [=======>......................] - ETA: 3:03 - loss: 0.1385 - regression_loss: 0.1238 - classification_loss: 0.0147 142/500 [=======>......................] 
- ETA: 3:02 - loss: 0.1387 - regression_loss: 0.1239 - classification_loss: 0.0147 143/500 [=======>......................] - ETA: 3:01 - loss: 0.1382 - regression_loss: 0.1235 - classification_loss: 0.0147 144/500 [=======>......................] - ETA: 3:01 - loss: 0.1383 - regression_loss: 0.1236 - classification_loss: 0.0147 145/500 [=======>......................] - ETA: 3:00 - loss: 0.1386 - regression_loss: 0.1239 - classification_loss: 0.0147 146/500 [=======>......................] - ETA: 3:00 - loss: 0.1383 - regression_loss: 0.1236 - classification_loss: 0.0147 147/500 [=======>......................] - ETA: 2:59 - loss: 0.1379 - regression_loss: 0.1233 - classification_loss: 0.0146 148/500 [=======>......................] - ETA: 2:59 - loss: 0.1378 - regression_loss: 0.1231 - classification_loss: 0.0146 149/500 [=======>......................] - ETA: 2:58 - loss: 0.1375 - regression_loss: 0.1229 - classification_loss: 0.0146 150/500 [========>.....................] - ETA: 2:58 - loss: 0.1371 - regression_loss: 0.1225 - classification_loss: 0.0146 151/500 [========>.....................] - ETA: 2:57 - loss: 0.1366 - regression_loss: 0.1221 - classification_loss: 0.0145 152/500 [========>.....................] - ETA: 2:57 - loss: 0.1361 - regression_loss: 0.1216 - classification_loss: 0.0145 153/500 [========>.....................] - ETA: 2:57 - loss: 0.1354 - regression_loss: 0.1210 - classification_loss: 0.0144 154/500 [========>.....................] - ETA: 2:56 - loss: 0.1350 - regression_loss: 0.1207 - classification_loss: 0.0144 155/500 [========>.....................] - ETA: 2:56 - loss: 0.1352 - regression_loss: 0.1208 - classification_loss: 0.0144 156/500 [========>.....................] - ETA: 2:55 - loss: 0.1350 - regression_loss: 0.1207 - classification_loss: 0.0144 157/500 [========>.....................] - ETA: 2:54 - loss: 0.1347 - regression_loss: 0.1204 - classification_loss: 0.0144 158/500 [========>.....................] 
- ETA: 2:54 - loss: 0.1343 - regression_loss: 0.1200 - classification_loss: 0.0143 159/500 [========>.....................] - ETA: 2:54 - loss: 0.1340 - regression_loss: 0.1198 - classification_loss: 0.0143 160/500 [========>.....................] - ETA: 2:53 - loss: 0.1334 - regression_loss: 0.1192 - classification_loss: 0.0142 161/500 [========>.....................] - ETA: 2:52 - loss: 0.1334 - regression_loss: 0.1192 - classification_loss: 0.0142 162/500 [========>.....................] - ETA: 2:52 - loss: 0.1328 - regression_loss: 0.1186 - classification_loss: 0.0142 163/500 [========>.....................] - ETA: 2:52 - loss: 0.1323 - regression_loss: 0.1182 - classification_loss: 0.0142 164/500 [========>.....................] - ETA: 2:51 - loss: 0.1323 - regression_loss: 0.1181 - classification_loss: 0.0142 165/500 [========>.....................] - ETA: 2:50 - loss: 0.1318 - regression_loss: 0.1177 - classification_loss: 0.0141 166/500 [========>.....................] - ETA: 2:50 - loss: 0.1314 - regression_loss: 0.1173 - classification_loss: 0.0141 167/500 [=========>....................] - ETA: 2:50 - loss: 0.1309 - regression_loss: 0.1169 - classification_loss: 0.0141 168/500 [=========>....................] - ETA: 2:49 - loss: 0.1303 - regression_loss: 0.1163 - classification_loss: 0.0140 169/500 [=========>....................] - ETA: 2:49 - loss: 0.1299 - regression_loss: 0.1159 - classification_loss: 0.0140 170/500 [=========>....................] - ETA: 2:48 - loss: 0.1296 - regression_loss: 0.1157 - classification_loss: 0.0139 171/500 [=========>....................] - ETA: 2:48 - loss: 0.1296 - regression_loss: 0.1156 - classification_loss: 0.0139 172/500 [=========>....................] - ETA: 2:47 - loss: 0.1295 - regression_loss: 0.1156 - classification_loss: 0.0139 173/500 [=========>....................] - ETA: 2:47 - loss: 0.1295 - regression_loss: 0.1156 - classification_loss: 0.0139 174/500 [=========>....................] 
- ETA: 2:46 - loss: 0.1292 - regression_loss: 0.1153 - classification_loss: 0.0139 175/500 [=========>....................] - ETA: 2:46 - loss: 0.1288 - regression_loss: 0.1149 - classification_loss: 0.0139 176/500 [=========>....................] - ETA: 2:45 - loss: 0.1287 - regression_loss: 0.1149 - classification_loss: 0.0138 177/500 [=========>....................] - ETA: 2:44 - loss: 0.1284 - regression_loss: 0.1146 - classification_loss: 0.0138 178/500 [=========>....................] - ETA: 2:44 - loss: 0.1285 - regression_loss: 0.1147 - classification_loss: 0.0138 179/500 [=========>....................] - ETA: 2:43 - loss: 0.1280 - regression_loss: 0.1143 - classification_loss: 0.0138 180/500 [=========>....................] - ETA: 2:43 - loss: 0.1276 - regression_loss: 0.1138 - classification_loss: 0.0138 181/500 [=========>....................] - ETA: 2:42 - loss: 0.1275 - regression_loss: 0.1137 - classification_loss: 0.0138 182/500 [=========>....................] - ETA: 2:42 - loss: 0.1271 - regression_loss: 0.1133 - classification_loss: 0.0137 183/500 [=========>....................] - ETA: 2:41 - loss: 0.1272 - regression_loss: 0.1135 - classification_loss: 0.0137 184/500 [==========>...................] - ETA: 2:41 - loss: 0.1269 - regression_loss: 0.1132 - classification_loss: 0.0137 185/500 [==========>...................] - ETA: 2:40 - loss: 0.1265 - regression_loss: 0.1129 - classification_loss: 0.0137 186/500 [==========>...................] - ETA: 2:40 - loss: 0.1263 - regression_loss: 0.1126 - classification_loss: 0.0136 187/500 [==========>...................] - ETA: 2:39 - loss: 0.1260 - regression_loss: 0.1123 - classification_loss: 0.0136 188/500 [==========>...................] - ETA: 2:39 - loss: 0.1257 - regression_loss: 0.1121 - classification_loss: 0.0136 189/500 [==========>...................] - ETA: 2:38 - loss: 0.1254 - regression_loss: 0.1118 - classification_loss: 0.0136 190/500 [==========>...................] 
- ETA: 2:38 - loss: 0.1250 - regression_loss: 0.1115 - classification_loss: 0.0135 191/500 [==========>...................] - ETA: 2:37 - loss: 0.1246 - regression_loss: 0.1112 - classification_loss: 0.0135 192/500 [==========>...................] - ETA: 2:37 - loss: 0.1247 - regression_loss: 0.1113 - classification_loss: 0.0134 193/500 [==========>...................] - ETA: 2:36 - loss: 0.1246 - regression_loss: 0.1112 - classification_loss: 0.0134 194/500 [==========>...................] - ETA: 2:36 - loss: 0.1253 - regression_loss: 0.1118 - classification_loss: 0.0135 195/500 [==========>...................] - ETA: 2:35 - loss: 0.1250 - regression_loss: 0.1116 - classification_loss: 0.0135 196/500 [==========>...................] - ETA: 2:35 - loss: 0.1290 - regression_loss: 0.1143 - classification_loss: 0.0147 197/500 [==========>...................] - ETA: 2:34 - loss: 0.1291 - regression_loss: 0.1144 - classification_loss: 0.0147 198/500 [==========>...................] - ETA: 2:34 - loss: 0.1287 - regression_loss: 0.1141 - classification_loss: 0.0147 199/500 [==========>...................] - ETA: 2:33 - loss: 0.1290 - regression_loss: 0.1144 - classification_loss: 0.0146 200/500 [===========>..................] - ETA: 2:33 - loss: 0.1290 - regression_loss: 0.1144 - classification_loss: 0.0146 201/500 [===========>..................] - ETA: 2:32 - loss: 0.1290 - regression_loss: 0.1144 - classification_loss: 0.0146 202/500 [===========>..................] - ETA: 2:32 - loss: 0.1289 - regression_loss: 0.1143 - classification_loss: 0.0145 203/500 [===========>..................] - ETA: 2:31 - loss: 0.1289 - regression_loss: 0.1143 - classification_loss: 0.0145 204/500 [===========>..................] - ETA: 2:31 - loss: 0.1285 - regression_loss: 0.1140 - classification_loss: 0.0145 205/500 [===========>..................] - ETA: 2:30 - loss: 0.1282 - regression_loss: 0.1138 - classification_loss: 0.0145 206/500 [===========>..................] 
- ETA: 2:30 - loss: 0.1277 - regression_loss: 0.1133 - classification_loss: 0.0144 207/500 [===========>..................] - ETA: 2:29 - loss: 0.1274 - regression_loss: 0.1130 - classification_loss: 0.0144 208/500 [===========>..................] - ETA: 2:29 - loss: 0.1272 - regression_loss: 0.1129 - classification_loss: 0.0144 209/500 [===========>..................] - ETA: 2:28 - loss: 0.1269 - regression_loss: 0.1126 - classification_loss: 0.0143 210/500 [===========>..................] - ETA: 2:28 - loss: 0.1270 - regression_loss: 0.1127 - classification_loss: 0.0143 211/500 [===========>..................] - ETA: 2:27 - loss: 0.1284 - regression_loss: 0.1137 - classification_loss: 0.0147 212/500 [===========>..................] - ETA: 2:27 - loss: 0.1280 - regression_loss: 0.1133 - classification_loss: 0.0147 213/500 [===========>..................] - ETA: 2:26 - loss: 0.1279 - regression_loss: 0.1132 - classification_loss: 0.0146 214/500 [===========>..................] - ETA: 2:26 - loss: 0.1276 - regression_loss: 0.1130 - classification_loss: 0.0146 215/500 [===========>..................] - ETA: 2:25 - loss: 0.1274 - regression_loss: 0.1128 - classification_loss: 0.0146 216/500 [===========>..................] - ETA: 2:25 - loss: 0.1272 - regression_loss: 0.1127 - classification_loss: 0.0145 217/500 [============>.................] - ETA: 2:24 - loss: 0.1270 - regression_loss: 0.1125 - classification_loss: 0.0145 218/500 [============>.................] - ETA: 2:24 - loss: 0.1266 - regression_loss: 0.1121 - classification_loss: 0.0145 219/500 [============>.................] - ETA: 2:23 - loss: 0.1267 - regression_loss: 0.1122 - classification_loss: 0.0145 220/500 [============>.................] - ETA: 2:23 - loss: 0.1266 - regression_loss: 0.1121 - classification_loss: 0.0145 221/500 [============>.................] - ETA: 2:22 - loss: 0.1265 - regression_loss: 0.1120 - classification_loss: 0.0145 222/500 [============>.................] 
- ETA: 2:22 - loss: 0.1262 - regression_loss: 0.1117 - classification_loss: 0.0144 223/500 [============>.................] - ETA: 2:21 - loss: 0.1265 - regression_loss: 0.1120 - classification_loss: 0.0145 224/500 [============>.................] - ETA: 2:21 - loss: 0.1261 - regression_loss: 0.1117 - classification_loss: 0.0145 225/500 [============>.................] - ETA: 2:20 - loss: 0.1270 - regression_loss: 0.1126 - classification_loss: 0.0145 226/500 [============>.................] - ETA: 2:20 - loss: 0.1268 - regression_loss: 0.1123 - classification_loss: 0.0144 227/500 [============>.................] - ETA: 2:19 - loss: 0.1265 - regression_loss: 0.1120 - classification_loss: 0.0144 228/500 [============>.................] - ETA: 2:19 - loss: 0.1272 - regression_loss: 0.1127 - classification_loss: 0.0145 229/500 [============>.................] - ETA: 2:18 - loss: 0.1273 - regression_loss: 0.1128 - classification_loss: 0.0145 230/500 [============>.................] - ETA: 2:17 - loss: 0.1269 - regression_loss: 0.1125 - classification_loss: 0.0145 231/500 [============>.................] - ETA: 2:17 - loss: 0.1269 - regression_loss: 0.1124 - classification_loss: 0.0145 232/500 [============>.................] - ETA: 2:16 - loss: 0.1267 - regression_loss: 0.1122 - classification_loss: 0.0144 233/500 [============>.................] - ETA: 2:16 - loss: 0.1264 - regression_loss: 0.1120 - classification_loss: 0.0144 234/500 [=============>................] - ETA: 2:15 - loss: 0.1267 - regression_loss: 0.1123 - classification_loss: 0.0144 235/500 [=============>................] - ETA: 2:15 - loss: 0.1265 - regression_loss: 0.1121 - classification_loss: 0.0144 236/500 [=============>................] - ETA: 2:14 - loss: 0.1261 - regression_loss: 0.1117 - classification_loss: 0.0143 237/500 [=============>................] - ETA: 2:14 - loss: 0.1279 - regression_loss: 0.1134 - classification_loss: 0.0145 238/500 [=============>................] 
- ETA: 2:13 - loss: 0.1281 - regression_loss: 0.1136 - classification_loss: 0.0145 239/500 [=============>................] - ETA: 2:13 - loss: 0.1280 - regression_loss: 0.1135 - classification_loss: 0.0145 240/500 [=============>................] - ETA: 2:12 - loss: 0.1278 - regression_loss: 0.1133 - classification_loss: 0.0145 241/500 [=============>................] - ETA: 2:12 - loss: 0.1274 - regression_loss: 0.1130 - classification_loss: 0.0144 242/500 [=============>................] - ETA: 2:11 - loss: 0.1275 - regression_loss: 0.1131 - classification_loss: 0.0144 243/500 [=============>................] - ETA: 2:11 - loss: 0.1277 - regression_loss: 0.1133 - classification_loss: 0.0145 244/500 [=============>................] - ETA: 2:10 - loss: 0.1274 - regression_loss: 0.1129 - classification_loss: 0.0144 245/500 [=============>................] - ETA: 2:10 - loss: 0.1270 - regression_loss: 0.1126 - classification_loss: 0.0144 246/500 [=============>................] - ETA: 2:09 - loss: 0.1268 - regression_loss: 0.1124 - classification_loss: 0.0144 247/500 [=============>................] - ETA: 2:09 - loss: 0.1265 - regression_loss: 0.1121 - classification_loss: 0.0144 248/500 [=============>................] - ETA: 2:08 - loss: 0.1263 - regression_loss: 0.1119 - classification_loss: 0.0144 249/500 [=============>................] - ETA: 2:08 - loss: 0.1260 - regression_loss: 0.1117 - classification_loss: 0.0143 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1258 - regression_loss: 0.1115 - classification_loss: 0.0143 251/500 [==============>...............] - ETA: 2:07 - loss: 0.1255 - regression_loss: 0.1112 - classification_loss: 0.0143 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1257 - regression_loss: 0.1114 - classification_loss: 0.0143 253/500 [==============>...............] - ETA: 2:06 - loss: 0.1254 - regression_loss: 0.1111 - classification_loss: 0.0143 254/500 [==============>...............] 
- ETA: 2:05 - loss: 0.1252 - regression_loss: 0.1109 - classification_loss: 0.0143 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1256 - regression_loss: 0.1113 - classification_loss: 0.0143 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1253 - regression_loss: 0.1111 - classification_loss: 0.0143 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1252 - regression_loss: 0.1110 - classification_loss: 0.0142 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1252 - regression_loss: 0.1110 - classification_loss: 0.0142 259/500 [==============>...............] - ETA: 2:03 - loss: 0.1253 - regression_loss: 0.1111 - classification_loss: 0.0142 260/500 [==============>...............] - ETA: 2:02 - loss: 0.1252 - regression_loss: 0.1110 - classification_loss: 0.0142 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1254 - regression_loss: 0.1112 - classification_loss: 0.0142 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1251 - regression_loss: 0.1109 - classification_loss: 0.0142 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1251 - regression_loss: 0.1109 - classification_loss: 0.0142 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1252 - regression_loss: 0.1110 - classification_loss: 0.0142 265/500 [==============>...............] - ETA: 2:00 - loss: 0.1250 - regression_loss: 0.1109 - classification_loss: 0.0142 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1249 - regression_loss: 0.1107 - classification_loss: 0.0141 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1247 - regression_loss: 0.1106 - classification_loss: 0.0141 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1244 - regression_loss: 0.1104 - classification_loss: 0.0141 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1243 - regression_loss: 0.1102 - classification_loss: 0.0141 270/500 [===============>..............] 
- ETA: 1:57 - loss: 0.1242 - regression_loss: 0.1101 - classification_loss: 0.0141 271/500 [===============>..............] - ETA: 1:56 - loss: 0.1241 - regression_loss: 0.1101 - classification_loss: 0.0140 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1239 - regression_loss: 0.1099 - classification_loss: 0.0140 273/500 [===============>..............] - ETA: 1:55 - loss: 0.1244 - regression_loss: 0.1104 - classification_loss: 0.0140 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1242 - regression_loss: 0.1102 - classification_loss: 0.0140 275/500 [===============>..............] - ETA: 1:54 - loss: 0.1243 - regression_loss: 0.1103 - classification_loss: 0.0140 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1240 - regression_loss: 0.1101 - classification_loss: 0.0140 277/500 [===============>..............] - ETA: 1:53 - loss: 0.1243 - regression_loss: 0.1103 - classification_loss: 0.0140 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1240 - regression_loss: 0.1101 - classification_loss: 0.0140 279/500 [===============>..............] - ETA: 1:52 - loss: 0.1238 - regression_loss: 0.1098 - classification_loss: 0.0139 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1235 - regression_loss: 0.1096 - classification_loss: 0.0139 281/500 [===============>..............] - ETA: 1:51 - loss: 0.1234 - regression_loss: 0.1094 - classification_loss: 0.0139 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1233 - regression_loss: 0.1094 - classification_loss: 0.0139 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1230 - regression_loss: 0.1091 - classification_loss: 0.0139 284/500 [================>.............] - ETA: 1:50 - loss: 0.1228 - regression_loss: 0.1089 - classification_loss: 0.0139 285/500 [================>.............] - ETA: 1:49 - loss: 0.1226 - regression_loss: 0.1087 - classification_loss: 0.0139 286/500 [================>.............] 
- ETA: 1:49 - loss: 0.1223 - regression_loss: 0.1085 - classification_loss: 0.0138 287/500 [================>.............] - ETA: 1:48 - loss: 0.1221 - regression_loss: 0.1083 - classification_loss: 0.0138 288/500 [================>.............] - ETA: 1:48 - loss: 0.1219 - regression_loss: 0.1081 - classification_loss: 0.0138 289/500 [================>.............] - ETA: 1:47 - loss: 0.1221 - regression_loss: 0.1083 - classification_loss: 0.0138 290/500 [================>.............] - ETA: 1:47 - loss: 0.1219 - regression_loss: 0.1081 - classification_loss: 0.0138 291/500 [================>.............] - ETA: 1:46 - loss: 0.1216 - regression_loss: 0.1079 - classification_loss: 0.0137 292/500 [================>.............] - ETA: 1:46 - loss: 0.1213 - regression_loss: 0.1076 - classification_loss: 0.0137 293/500 [================>.............] - ETA: 1:45 - loss: 0.1218 - regression_loss: 0.1081 - classification_loss: 0.0137 294/500 [================>.............] - ETA: 1:45 - loss: 0.1217 - regression_loss: 0.1080 - classification_loss: 0.0137 295/500 [================>.............] - ETA: 1:44 - loss: 0.1214 - regression_loss: 0.1077 - classification_loss: 0.0137 296/500 [================>.............] - ETA: 1:44 - loss: 0.1217 - regression_loss: 0.1080 - classification_loss: 0.0137 297/500 [================>.............] - ETA: 1:43 - loss: 0.1217 - regression_loss: 0.1080 - classification_loss: 0.0137 298/500 [================>.............] - ETA: 1:43 - loss: 0.1215 - regression_loss: 0.1078 - classification_loss: 0.0137 299/500 [================>.............] - ETA: 1:42 - loss: 0.1216 - regression_loss: 0.1079 - classification_loss: 0.0137 300/500 [=================>............] - ETA: 1:42 - loss: 0.1222 - regression_loss: 0.1085 - classification_loss: 0.0136 301/500 [=================>............] - ETA: 1:41 - loss: 0.1220 - regression_loss: 0.1084 - classification_loss: 0.0136 302/500 [=================>............] 
- ETA: 1:41 - loss: 0.1218 - regression_loss: 0.1082 - classification_loss: 0.0136 303/500 [=================>............] - ETA: 1:40 - loss: 0.1217 - regression_loss: 0.1081 - classification_loss: 0.0136 304/500 [=================>............] - ETA: 1:40 - loss: 0.1240 - regression_loss: 0.1097 - classification_loss: 0.0144 305/500 [=================>............] - ETA: 1:39 - loss: 0.1238 - regression_loss: 0.1095 - classification_loss: 0.0143 306/500 [=================>............] - ETA: 1:39 - loss: 0.1236 - regression_loss: 0.1093 - classification_loss: 0.0143 307/500 [=================>............] - ETA: 1:38 - loss: 0.1234 - regression_loss: 0.1091 - classification_loss: 0.0143 308/500 [=================>............] - ETA: 1:38 - loss: 0.1232 - regression_loss: 0.1090 - classification_loss: 0.0143 309/500 [=================>............] - ETA: 1:37 - loss: 0.1231 - regression_loss: 0.1088 - classification_loss: 0.0142 310/500 [=================>............] - ETA: 1:37 - loss: 0.1233 - regression_loss: 0.1090 - classification_loss: 0.0143 311/500 [=================>............] - ETA: 1:36 - loss: 0.1232 - regression_loss: 0.1089 - classification_loss: 0.0143 312/500 [=================>............] - ETA: 1:35 - loss: 0.1251 - regression_loss: 0.1107 - classification_loss: 0.0145 313/500 [=================>............] - ETA: 1:35 - loss: 0.1252 - regression_loss: 0.1107 - classification_loss: 0.0145 314/500 [=================>............] - ETA: 1:34 - loss: 0.1259 - regression_loss: 0.1114 - classification_loss: 0.0145 315/500 [=================>............] - ETA: 1:34 - loss: 0.1259 - regression_loss: 0.1114 - classification_loss: 0.0145 316/500 [=================>............] - ETA: 1:33 - loss: 0.1256 - regression_loss: 0.1111 - classification_loss: 0.0145 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1256 - regression_loss: 0.1111 - classification_loss: 0.0145 318/500 [==================>...........] 
- ETA: 1:32 - loss: 0.1254 - regression_loss: 0.1110 - classification_loss: 0.0145 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1252 - regression_loss: 0.1107 - classification_loss: 0.0144 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1256 - regression_loss: 0.1111 - classification_loss: 0.0145 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1255 - regression_loss: 0.1110 - classification_loss: 0.0145 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1258 - regression_loss: 0.1113 - classification_loss: 0.0145 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1267 - regression_loss: 0.1119 - classification_loss: 0.0147 324/500 [==================>...........] - ETA: 1:29 - loss: 0.1264 - regression_loss: 0.1117 - classification_loss: 0.0147 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1263 - regression_loss: 0.1116 - classification_loss: 0.0147 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1261 - regression_loss: 0.1114 - classification_loss: 0.0147 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1259 - regression_loss: 0.1112 - classification_loss: 0.0146 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1257 - regression_loss: 0.1111 - classification_loss: 0.0146 329/500 [==================>...........] - ETA: 1:27 - loss: 0.1256 - regression_loss: 0.1109 - classification_loss: 0.0146 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1253 - regression_loss: 0.1107 - classification_loss: 0.0146 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1255 - regression_loss: 0.1109 - classification_loss: 0.0146 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1255 - regression_loss: 0.1109 - classification_loss: 0.0146 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1254 - regression_loss: 0.1108 - classification_loss: 0.0146 334/500 [===================>..........] 
- ETA: 1:24 - loss: 0.1254 - regression_loss: 0.1108 - classification_loss: 0.0146 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1253 - regression_loss: 0.1108 - classification_loss: 0.0146 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1252 - regression_loss: 0.1106 - classification_loss: 0.0145 337/500 [===================>..........] - ETA: 1:23 - loss: 0.1250 - regression_loss: 0.1105 - classification_loss: 0.0145 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1248 - regression_loss: 0.1103 - classification_loss: 0.0145 339/500 [===================>..........] - ETA: 1:22 - loss: 0.1247 - regression_loss: 0.1102 - classification_loss: 0.0145 340/500 [===================>..........] - ETA: 1:21 - loss: 0.1245 - regression_loss: 0.1100 - classification_loss: 0.0145 341/500 [===================>..........] - ETA: 1:21 - loss: 0.1246 - regression_loss: 0.1101 - classification_loss: 0.0145 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1243 - regression_loss: 0.1099 - classification_loss: 0.0144 343/500 [===================>..........] - ETA: 1:20 - loss: 0.1248 - regression_loss: 0.1103 - classification_loss: 0.0145 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1248 - regression_loss: 0.1103 - classification_loss: 0.0145 345/500 [===================>..........] - ETA: 1:19 - loss: 0.1246 - regression_loss: 0.1101 - classification_loss: 0.0145 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1244 - regression_loss: 0.1099 - classification_loss: 0.0145 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1243 - regression_loss: 0.1099 - classification_loss: 0.0145 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1243 - regression_loss: 0.1098 - classification_loss: 0.0145 349/500 [===================>..........] - ETA: 1:17 - loss: 0.1242 - regression_loss: 0.1098 - classification_loss: 0.0145 350/500 [====================>.........] 
- ETA: 1:16 - loss: 0.1244 - regression_loss: 0.1100 - classification_loss: 0.0145 351/500 [====================>.........] - ETA: 1:16 - loss: 0.1243 - regression_loss: 0.1099 - classification_loss: 0.0144 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1243 - regression_loss: 0.1099 - classification_loss: 0.0145 353/500 [====================>.........] - ETA: 1:15 - loss: 0.1241 - regression_loss: 0.1097 - classification_loss: 0.0144 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1238 - regression_loss: 0.1095 - classification_loss: 0.0144 355/500 [====================>.........] - ETA: 1:14 - loss: 0.1237 - regression_loss: 0.1093 - classification_loss: 0.0144 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1236 - regression_loss: 0.1092 - classification_loss: 0.0144 357/500 [====================>.........] - ETA: 1:13 - loss: 0.1233 - regression_loss: 0.1090 - classification_loss: 0.0143 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1234 - regression_loss: 0.1091 - classification_loss: 0.0143 359/500 [====================>.........] - ETA: 1:12 - loss: 0.1232 - regression_loss: 0.1089 - classification_loss: 0.0143 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1231 - regression_loss: 0.1088 - classification_loss: 0.0143 361/500 [====================>.........] - ETA: 1:11 - loss: 0.1231 - regression_loss: 0.1087 - classification_loss: 0.0143 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1229 - regression_loss: 0.1086 - classification_loss: 0.0143 363/500 [====================>.........] - ETA: 1:09 - loss: 0.1231 - regression_loss: 0.1088 - classification_loss: 0.0143 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1230 - regression_loss: 0.1087 - classification_loss: 0.0143 365/500 [====================>.........] - ETA: 1:08 - loss: 0.1229 - regression_loss: 0.1086 - classification_loss: 0.0143 366/500 [====================>.........] 
- ETA: 1:08 - loss: 0.1229 - regression_loss: 0.1087 - classification_loss: 0.0143 367/500 [=====================>........] - ETA: 1:07 - loss: 0.1231 - regression_loss: 0.1088 - classification_loss: 0.0143 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1228 - regression_loss: 0.1086 - classification_loss: 0.0142 369/500 [=====================>........] - ETA: 1:06 - loss: 0.1227 - regression_loss: 0.1085 - classification_loss: 0.0142 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1226 - regression_loss: 0.1084 - classification_loss: 0.0142 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1226 - regression_loss: 0.1084 - classification_loss: 0.0142 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1225 - regression_loss: 0.1083 - classification_loss: 0.0142 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1224 - regression_loss: 0.1083 - classification_loss: 0.0142 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1227 - regression_loss: 0.1085 - classification_loss: 0.0142 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1227 - regression_loss: 0.1085 - classification_loss: 0.0142 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1225 - regression_loss: 0.1083 - classification_loss: 0.0142 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1223 - regression_loss: 0.1082 - classification_loss: 0.0142 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1222 - regression_loss: 0.1080 - classification_loss: 0.0142 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1226 - regression_loss: 0.1083 - classification_loss: 0.0143 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1224 - regression_loss: 0.1082 - classification_loss: 0.0143 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1223 - regression_loss: 0.1080 - classification_loss: 0.0142 382/500 [=====================>........] 
- ETA: 1:00 - loss: 0.1220 - regression_loss: 0.1078 - classification_loss: 0.0142 383/500 [=====================>........] - ETA: 59s - loss: 0.1220 - regression_loss: 0.1078 - classification_loss: 0.0142  384/500 [======================>.......] - ETA: 59s - loss: 0.1218 - regression_loss: 0.1076 - classification_loss: 0.0142 385/500 [======================>.......] - ETA: 58s - loss: 0.1219 - regression_loss: 0.1077 - classification_loss: 0.0142 386/500 [======================>.......] - ETA: 58s - loss: 0.1217 - regression_loss: 0.1075 - classification_loss: 0.0142 387/500 [======================>.......] - ETA: 57s - loss: 0.1215 - regression_loss: 0.1074 - classification_loss: 0.0141 388/500 [======================>.......] - ETA: 57s - loss: 0.1214 - regression_loss: 0.1073 - classification_loss: 0.0141 389/500 [======================>.......] - ETA: 56s - loss: 0.1216 - regression_loss: 0.1075 - classification_loss: 0.0141 390/500 [======================>.......] - ETA: 56s - loss: 0.1214 - regression_loss: 0.1073 - classification_loss: 0.0141 391/500 [======================>.......] - ETA: 55s - loss: 0.1217 - regression_loss: 0.1076 - classification_loss: 0.0141 392/500 [======================>.......] - ETA: 55s - loss: 0.1215 - regression_loss: 0.1074 - classification_loss: 0.0141 393/500 [======================>.......] - ETA: 54s - loss: 0.1221 - regression_loss: 0.1080 - classification_loss: 0.0141 394/500 [======================>.......] - ETA: 54s - loss: 0.1223 - regression_loss: 0.1081 - classification_loss: 0.0141 395/500 [======================>.......] - ETA: 53s - loss: 0.1220 - regression_loss: 0.1079 - classification_loss: 0.0141 396/500 [======================>.......] - ETA: 53s - loss: 0.1221 - regression_loss: 0.1079 - classification_loss: 0.0141 397/500 [======================>.......] - ETA: 52s - loss: 0.1220 - regression_loss: 0.1079 - classification_loss: 0.0141 398/500 [======================>.......] 
- ETA: 52s - loss: 0.1218 - regression_loss: 0.1077 - classification_loss: 0.0141 399/500 [======================>.......] - ETA: 51s - loss: 0.1218 - regression_loss: 0.1077 - classification_loss: 0.0141 400/500 [=======================>......] - ETA: 51s - loss: 0.1219 - regression_loss: 0.1079 - classification_loss: 0.0141 401/500 [=======================>......] - ETA: 50s - loss: 0.1221 - regression_loss: 0.1080 - classification_loss: 0.0141 402/500 [=======================>......] - ETA: 50s - loss: 0.1220 - regression_loss: 0.1079 - classification_loss: 0.0141 403/500 [=======================>......] - ETA: 49s - loss: 0.1223 - regression_loss: 0.1082 - classification_loss: 0.0141 404/500 [=======================>......] - ETA: 49s - loss: 0.1221 - regression_loss: 0.1080 - classification_loss: 0.0141 405/500 [=======================>......] - ETA: 48s - loss: 0.1220 - regression_loss: 0.1079 - classification_loss: 0.0141 406/500 [=======================>......] - ETA: 48s - loss: 0.1235 - regression_loss: 0.1093 - classification_loss: 0.0142 407/500 [=======================>......] - ETA: 47s - loss: 0.1234 - regression_loss: 0.1092 - classification_loss: 0.0142 408/500 [=======================>......] - ETA: 47s - loss: 0.1232 - regression_loss: 0.1090 - classification_loss: 0.0142 409/500 [=======================>......] - ETA: 46s - loss: 0.1231 - regression_loss: 0.1089 - classification_loss: 0.0142 410/500 [=======================>......] - ETA: 45s - loss: 0.1238 - regression_loss: 0.1097 - classification_loss: 0.0142 411/500 [=======================>......] - ETA: 45s - loss: 0.1237 - regression_loss: 0.1095 - classification_loss: 0.0142 412/500 [=======================>......] - ETA: 44s - loss: 0.1236 - regression_loss: 0.1095 - classification_loss: 0.0142 413/500 [=======================>......] - ETA: 44s - loss: 0.1234 - regression_loss: 0.1093 - classification_loss: 0.0141 414/500 [=======================>......] 
- ETA: 43s - loss: 0.1233 - regression_loss: 0.1091 - classification_loss: 0.0141 415/500 [=======================>......] - ETA: 43s - loss: 0.1232 - regression_loss: 0.1090 - classification_loss: 0.0141 416/500 [=======================>......] - ETA: 42s - loss: 0.1230 - regression_loss: 0.1089 - classification_loss: 0.0141 417/500 [========================>.....] - ETA: 42s - loss: 0.1228 - regression_loss: 0.1088 - classification_loss: 0.0141 418/500 [========================>.....] - ETA: 41s - loss: 0.1230 - regression_loss: 0.1089 - classification_loss: 0.0141 419/500 [========================>.....] - ETA: 41s - loss: 0.1230 - regression_loss: 0.1089 - classification_loss: 0.0141 420/500 [========================>.....] - ETA: 40s - loss: 0.1229 - regression_loss: 0.1088 - classification_loss: 0.0141 421/500 [========================>.....] - ETA: 40s - loss: 0.1227 - regression_loss: 0.1086 - classification_loss: 0.0141 422/500 [========================>.....] - ETA: 39s - loss: 0.1226 - regression_loss: 0.1085 - classification_loss: 0.0141 423/500 [========================>.....] - ETA: 39s - loss: 0.1227 - regression_loss: 0.1086 - classification_loss: 0.0141 424/500 [========================>.....] - ETA: 38s - loss: 0.1227 - regression_loss: 0.1086 - classification_loss: 0.0141 425/500 [========================>.....] - ETA: 38s - loss: 0.1228 - regression_loss: 0.1087 - classification_loss: 0.0141 426/500 [========================>.....] - ETA: 37s - loss: 0.1238 - regression_loss: 0.1097 - classification_loss: 0.0141 427/500 [========================>.....] - ETA: 37s - loss: 0.1236 - regression_loss: 0.1095 - classification_loss: 0.0141 428/500 [========================>.....] - ETA: 36s - loss: 0.1235 - regression_loss: 0.1094 - classification_loss: 0.0141 429/500 [========================>.....] - ETA: 36s - loss: 0.1234 - regression_loss: 0.1093 - classification_loss: 0.0141 430/500 [========================>.....] 
- ETA: 35s - loss: 0.1233 - regression_loss: 0.1092 - classification_loss: 0.0141 431/500 [========================>.....] - ETA: 35s - loss: 0.1233 - regression_loss: 0.1092 - classification_loss: 0.0141 432/500 [========================>.....] - ETA: 34s - loss: 0.1243 - regression_loss: 0.1101 - classification_loss: 0.0141 433/500 [========================>.....] - ETA: 34s - loss: 0.1241 - regression_loss: 0.1100 - classification_loss: 0.0141 434/500 [=========================>....] - ETA: 33s - loss: 0.1239 - regression_loss: 0.1098 - classification_loss: 0.0141 435/500 [=========================>....] - ETA: 33s - loss: 0.1240 - regression_loss: 0.1099 - classification_loss: 0.0141 436/500 [=========================>....] - ETA: 32s - loss: 0.1240 - regression_loss: 0.1099 - classification_loss: 0.0141 437/500 [=========================>....] - ETA: 32s - loss: 0.1241 - regression_loss: 0.1100 - classification_loss: 0.0141 438/500 [=========================>....] - ETA: 31s - loss: 0.1241 - regression_loss: 0.1100 - classification_loss: 0.0141 439/500 [=========================>....] - ETA: 31s - loss: 0.1241 - regression_loss: 0.1100 - classification_loss: 0.0141 440/500 [=========================>....] - ETA: 30s - loss: 0.1239 - regression_loss: 0.1099 - classification_loss: 0.0141 441/500 [=========================>....] - ETA: 30s - loss: 0.1239 - regression_loss: 0.1098 - classification_loss: 0.0141 442/500 [=========================>....] - ETA: 29s - loss: 0.1237 - regression_loss: 0.1097 - classification_loss: 0.0140 443/500 [=========================>....] - ETA: 29s - loss: 0.1235 - regression_loss: 0.1095 - classification_loss: 0.0140 444/500 [=========================>....] - ETA: 28s - loss: 0.1233 - regression_loss: 0.1093 - classification_loss: 0.0140 445/500 [=========================>....] - ETA: 28s - loss: 0.1232 - regression_loss: 0.1092 - classification_loss: 0.0140 446/500 [=========================>....] 
- ETA: 27s - loss: 0.1230 - regression_loss: 0.1091 - classification_loss: 0.0140 447/500 [=========================>....] - ETA: 27s - loss: 0.1230 - regression_loss: 0.1091 - classification_loss: 0.0140 448/500 [=========================>....] - ETA: 26s - loss: 0.1229 - regression_loss: 0.1089 - classification_loss: 0.0140 449/500 [=========================>....] - ETA: 26s - loss: 0.1228 - regression_loss: 0.1088 - classification_loss: 0.0139 450/500 [==========================>...] - ETA: 25s - loss: 0.1227 - regression_loss: 0.1087 - classification_loss: 0.0139 451/500 [==========================>...] - ETA: 25s - loss: 0.1225 - regression_loss: 0.1086 - classification_loss: 0.0139 452/500 [==========================>...] - ETA: 24s - loss: 0.1228 - regression_loss: 0.1089 - classification_loss: 0.0139 453/500 [==========================>...] - ETA: 24s - loss: 0.1227 - regression_loss: 0.1088 - classification_loss: 0.0139 454/500 [==========================>...] - ETA: 23s - loss: 0.1227 - regression_loss: 0.1088 - classification_loss: 0.0139 455/500 [==========================>...] - ETA: 22s - loss: 0.1234 - regression_loss: 0.1095 - classification_loss: 0.0139 456/500 [==========================>...] - ETA: 22s - loss: 0.1234 - regression_loss: 0.1095 - classification_loss: 0.0139 457/500 [==========================>...] - ETA: 21s - loss: 0.1234 - regression_loss: 0.1095 - classification_loss: 0.0139 458/500 [==========================>...] - ETA: 21s - loss: 0.1233 - regression_loss: 0.1094 - classification_loss: 0.0139 459/500 [==========================>...] - ETA: 20s - loss: 0.1231 - regression_loss: 0.1093 - classification_loss: 0.0139 460/500 [==========================>...] - ETA: 20s - loss: 0.1230 - regression_loss: 0.1091 - classification_loss: 0.0139 461/500 [==========================>...] - ETA: 19s - loss: 0.1228 - regression_loss: 0.1090 - classification_loss: 0.0139 462/500 [==========================>...] 
- ETA: 19s - loss: 0.1236 - regression_loss: 0.1096 - classification_loss: 0.0141 463/500 [==========================>...] - ETA: 18s - loss: 0.1235 - regression_loss: 0.1095 - classification_loss: 0.0140 464/500 [==========================>...] - ETA: 18s - loss: 0.1237 - regression_loss: 0.1096 - classification_loss: 0.0141 465/500 [==========================>...] - ETA: 17s - loss: 0.1237 - regression_loss: 0.1096 - classification_loss: 0.0141 466/500 [==========================>...] - ETA: 17s - loss: 0.1236 - regression_loss: 0.1095 - classification_loss: 0.0141 467/500 [===========================>..] - ETA: 16s - loss: 0.1239 - regression_loss: 0.1098 - classification_loss: 0.0141 468/500 [===========================>..] - ETA: 16s - loss: 0.1237 - regression_loss: 0.1096 - classification_loss: 0.0141 469/500 [===========================>..] - ETA: 15s - loss: 0.1236 - regression_loss: 0.1096 - classification_loss: 0.0141 470/500 [===========================>..] - ETA: 15s - loss: 0.1236 - regression_loss: 0.1095 - classification_loss: 0.0141 471/500 [===========================>..] - ETA: 14s - loss: 0.1253 - regression_loss: 0.1107 - classification_loss: 0.0146 472/500 [===========================>..] - ETA: 14s - loss: 0.1251 - regression_loss: 0.1105 - classification_loss: 0.0146 473/500 [===========================>..] - ETA: 13s - loss: 0.1249 - regression_loss: 0.1103 - classification_loss: 0.0146 474/500 [===========================>..] - ETA: 13s - loss: 0.1248 - regression_loss: 0.1103 - classification_loss: 0.0146 475/500 [===========================>..] - ETA: 12s - loss: 0.1246 - regression_loss: 0.1101 - classification_loss: 0.0145 476/500 [===========================>..] - ETA: 12s - loss: 0.1247 - regression_loss: 0.1102 - classification_loss: 0.0145 477/500 [===========================>..] - ETA: 11s - loss: 0.1248 - regression_loss: 0.1102 - classification_loss: 0.0145 478/500 [===========================>..] 
- ETA: 11s - loss: 0.1246 - regression_loss: 0.1101 - classification_loss: 0.0145 479/500 [===========================>..] - ETA: 10s - loss: 0.1245 - regression_loss: 0.1100 - classification_loss: 0.0145 480/500 [===========================>..] - ETA: 10s - loss: 0.1245 - regression_loss: 0.1100 - classification_loss: 0.0145 481/500 [===========================>..] - ETA: 9s - loss: 0.1243 - regression_loss: 0.1098 - classification_loss: 0.0145  482/500 [===========================>..] - ETA: 9s - loss: 0.1242 - regression_loss: 0.1098 - classification_loss: 0.0145 483/500 [===========================>..] - ETA: 8s - loss: 0.1241 - regression_loss: 0.1097 - classification_loss: 0.0145 484/500 [============================>.] - ETA: 8s - loss: 0.1242 - regression_loss: 0.1097 - classification_loss: 0.0145 485/500 [============================>.] - ETA: 7s - loss: 0.1240 - regression_loss: 0.1096 - classification_loss: 0.0145 486/500 [============================>.] - ETA: 7s - loss: 0.1245 - regression_loss: 0.1100 - classification_loss: 0.0145 487/500 [============================>.] - ETA: 6s - loss: 0.1245 - regression_loss: 0.1100 - classification_loss: 0.0145 488/500 [============================>.] - ETA: 6s - loss: 0.1244 - regression_loss: 0.1100 - classification_loss: 0.0145 489/500 [============================>.] - ETA: 5s - loss: 0.1243 - regression_loss: 0.1098 - classification_loss: 0.0145 490/500 [============================>.] - ETA: 5s - loss: 0.1241 - regression_loss: 0.1097 - classification_loss: 0.0145 491/500 [============================>.] - ETA: 4s - loss: 0.1241 - regression_loss: 0.1097 - classification_loss: 0.0144 492/500 [============================>.] - ETA: 4s - loss: 0.1239 - regression_loss: 0.1095 - classification_loss: 0.0144 493/500 [============================>.] - ETA: 3s - loss: 0.1238 - regression_loss: 0.1094 - classification_loss: 0.0144 494/500 [============================>.] 
- ETA: 3s - loss: 0.1237 - regression_loss: 0.1093 - classification_loss: 0.0144 495/500 [============================>.] - ETA: 2s - loss: 0.1237 - regression_loss: 0.1093 - classification_loss: 0.0144 496/500 [============================>.] - ETA: 2s - loss: 0.1235 - regression_loss: 0.1092 - classification_loss: 0.0144 497/500 [============================>.] - ETA: 1s - loss: 0.1236 - regression_loss: 0.1092 - classification_loss: 0.0144 498/500 [============================>.] - ETA: 1s - loss: 0.1235 - regression_loss: 0.1091 - classification_loss: 0.0144 499/500 [============================>.] - ETA: 0s - loss: 0.1233 - regression_loss: 0.1090 - classification_loss: 0.0144 500/500 [==============================] - 255s 511ms/step - loss: 0.1232 - regression_loss: 0.1089 - classification_loss: 0.0143
326 instances of class plum with average precision: 0.8403
mAP: 0.8403
Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5
Epoch 11/150
1/500 [..............................] - ETA: 4:26 - loss: 0.0255 - regression_loss: 0.0220 - classification_loss: 0.0036 2/500 [..............................] - ETA: 4:22 - loss: 0.1857 - regression_loss: 0.1762 - classification_loss: 0.0095 3/500 [..............................] - ETA: 4:17 - loss: 0.2127 - regression_loss: 0.1875 - classification_loss: 0.0252 4/500 [..............................] - ETA: 4:16 - loss: 0.2330 - regression_loss: 0.2044 - classification_loss: 0.0286 5/500 [..............................] - ETA: 4:14 - loss: 0.2989 - regression_loss: 0.2658 - classification_loss: 0.0331 6/500 [..............................] - ETA: 4:13 - loss: 0.2596 - regression_loss: 0.2305 - classification_loss: 0.0291 7/500 [..............................] - ETA: 4:11 - loss: 0.2312 - regression_loss: 0.2046 - classification_loss: 0.0266 8/500 [..............................] - ETA: 4:11 - loss: 0.2083 - regression_loss: 0.1841 - classification_loss: 0.0243 9/500 [..............................]
- ETA: 4:11 - loss: 0.1912 - regression_loss: 0.1690 - classification_loss: 0.0223 10/500 [..............................] - ETA: 4:09 - loss: 0.1790 - regression_loss: 0.1582 - classification_loss: 0.0208 11/500 [..............................] - ETA: 4:07 - loss: 0.1877 - regression_loss: 0.1664 - classification_loss: 0.0212 12/500 [..............................] - ETA: 4:07 - loss: 0.1795 - regression_loss: 0.1594 - classification_loss: 0.0201 13/500 [..............................] - ETA: 4:07 - loss: 0.1693 - regression_loss: 0.1503 - classification_loss: 0.0191 14/500 [..............................] - ETA: 4:07 - loss: 0.1607 - regression_loss: 0.1423 - classification_loss: 0.0184 15/500 [..............................] - ETA: 4:06 - loss: 0.1572 - regression_loss: 0.1394 - classification_loss: 0.0178 16/500 [..............................] - ETA: 4:05 - loss: 0.1528 - regression_loss: 0.1358 - classification_loss: 0.0170 17/500 [>.............................] - ETA: 4:05 - loss: 0.1460 - regression_loss: 0.1297 - classification_loss: 0.0162 18/500 [>.............................] - ETA: 4:05 - loss: 0.1400 - regression_loss: 0.1243 - classification_loss: 0.0157 19/500 [>.............................] - ETA: 4:04 - loss: 0.1395 - regression_loss: 0.1240 - classification_loss: 0.0154 20/500 [>.............................] - ETA: 4:03 - loss: 0.1345 - regression_loss: 0.1196 - classification_loss: 0.0149 21/500 [>.............................] - ETA: 4:03 - loss: 0.1366 - regression_loss: 0.1220 - classification_loss: 0.0147 22/500 [>.............................] - ETA: 4:02 - loss: 0.1349 - regression_loss: 0.1204 - classification_loss: 0.0145 23/500 [>.............................] - ETA: 4:02 - loss: 0.1356 - regression_loss: 0.1211 - classification_loss: 0.0146 24/500 [>.............................] - ETA: 4:01 - loss: 0.1355 - regression_loss: 0.1212 - classification_loss: 0.0143 25/500 [>.............................] 
- ETA: 4:01 - loss: 0.1322 - regression_loss: 0.1181 - classification_loss: 0.0141 26/500 [>.............................] - ETA: 4:00 - loss: 0.1336 - regression_loss: 0.1195 - classification_loss: 0.0141 27/500 [>.............................] - ETA: 4:00 - loss: 0.1322 - regression_loss: 0.1181 - classification_loss: 0.0140 28/500 [>.............................] - ETA: 4:00 - loss: 0.1295 - regression_loss: 0.1158 - classification_loss: 0.0137 29/500 [>.............................] - ETA: 3:59 - loss: 0.1445 - regression_loss: 0.1271 - classification_loss: 0.0175 30/500 [>.............................] - ETA: 3:59 - loss: 0.1421 - regression_loss: 0.1249 - classification_loss: 0.0172 31/500 [>.............................] - ETA: 3:58 - loss: 0.1406 - regression_loss: 0.1237 - classification_loss: 0.0170 32/500 [>.............................] - ETA: 3:58 - loss: 0.1397 - regression_loss: 0.1229 - classification_loss: 0.0168 33/500 [>.............................] - ETA: 3:57 - loss: 0.1397 - regression_loss: 0.1228 - classification_loss: 0.0168 34/500 [=>............................] - ETA: 3:57 - loss: 0.1384 - regression_loss: 0.1218 - classification_loss: 0.0166 35/500 [=>............................] - ETA: 3:56 - loss: 0.1402 - regression_loss: 0.1235 - classification_loss: 0.0167 36/500 [=>............................] - ETA: 3:56 - loss: 0.1416 - regression_loss: 0.1250 - classification_loss: 0.0166 37/500 [=>............................] - ETA: 3:55 - loss: 0.1393 - regression_loss: 0.1230 - classification_loss: 0.0164 38/500 [=>............................] - ETA: 3:54 - loss: 0.1366 - regression_loss: 0.1205 - classification_loss: 0.0161 39/500 [=>............................] - ETA: 3:54 - loss: 0.1349 - regression_loss: 0.1190 - classification_loss: 0.0159 40/500 [=>............................] - ETA: 3:53 - loss: 0.1336 - regression_loss: 0.1180 - classification_loss: 0.0156 41/500 [=>............................] 
- ETA: 3:53 - loss: 0.1319 - regression_loss: 0.1165 - classification_loss: 0.0154 42/500 [=>............................] - ETA: 3:53 - loss: 0.1310 - regression_loss: 0.1157 - classification_loss: 0.0152 43/500 [=>............................] - ETA: 3:52 - loss: 0.1295 - regression_loss: 0.1145 - classification_loss: 0.0150 44/500 [=>............................] - ETA: 3:52 - loss: 0.1282 - regression_loss: 0.1134 - classification_loss: 0.0148 45/500 [=>............................] - ETA: 3:51 - loss: 0.1290 - regression_loss: 0.1140 - classification_loss: 0.0151 46/500 [=>............................] - ETA: 3:51 - loss: 0.1283 - regression_loss: 0.1133 - classification_loss: 0.0150 47/500 [=>............................] - ETA: 3:51 - loss: 0.1261 - regression_loss: 0.1113 - classification_loss: 0.0148 48/500 [=>............................] - ETA: 3:50 - loss: 0.1248 - regression_loss: 0.1101 - classification_loss: 0.0146 49/500 [=>............................] - ETA: 3:50 - loss: 0.1263 - regression_loss: 0.1118 - classification_loss: 0.0146 50/500 [==>...........................] - ETA: 3:49 - loss: 0.1249 - regression_loss: 0.1105 - classification_loss: 0.0144 51/500 [==>...........................] - ETA: 3:48 - loss: 0.1305 - regression_loss: 0.1147 - classification_loss: 0.0158 52/500 [==>...........................] - ETA: 3:48 - loss: 0.1319 - regression_loss: 0.1162 - classification_loss: 0.0157 53/500 [==>...........................] - ETA: 3:47 - loss: 0.1299 - regression_loss: 0.1144 - classification_loss: 0.0155 54/500 [==>...........................] - ETA: 3:47 - loss: 0.1299 - regression_loss: 0.1144 - classification_loss: 0.0155 55/500 [==>...........................] - ETA: 3:46 - loss: 0.1301 - regression_loss: 0.1147 - classification_loss: 0.0155 56/500 [==>...........................] - ETA: 3:46 - loss: 0.1284 - regression_loss: 0.1132 - classification_loss: 0.0153 57/500 [==>...........................] 
- ETA: 3:46 - loss: 0.1393 - regression_loss: 0.1229 - classification_loss: 0.0164 58/500 [==>...........................] - ETA: 3:45 - loss: 0.1396 - regression_loss: 0.1233 - classification_loss: 0.0164 59/500 [==>...........................] - ETA: 3:44 - loss: 0.1382 - regression_loss: 0.1220 - classification_loss: 0.0162 60/500 [==>...........................] - ETA: 3:44 - loss: 0.1366 - regression_loss: 0.1205 - classification_loss: 0.0161 61/500 [==>...........................] - ETA: 3:44 - loss: 0.1356 - regression_loss: 0.1195 - classification_loss: 0.0161 62/500 [==>...........................] - ETA: 3:43 - loss: 0.1346 - regression_loss: 0.1186 - classification_loss: 0.0160 63/500 [==>...........................] - ETA: 3:43 - loss: 0.1333 - regression_loss: 0.1175 - classification_loss: 0.0158 64/500 [==>...........................] - ETA: 3:42 - loss: 0.1323 - regression_loss: 0.1166 - classification_loss: 0.0156 65/500 [==>...........................] - ETA: 3:41 - loss: 0.1317 - regression_loss: 0.1162 - classification_loss: 0.0155 66/500 [==>...........................] - ETA: 3:41 - loss: 0.1308 - regression_loss: 0.1154 - classification_loss: 0.0153 67/500 [===>..........................] - ETA: 3:40 - loss: 0.1304 - regression_loss: 0.1151 - classification_loss: 0.0153 68/500 [===>..........................] - ETA: 3:40 - loss: 0.1289 - regression_loss: 0.1138 - classification_loss: 0.0151 69/500 [===>..........................] - ETA: 3:39 - loss: 0.1293 - regression_loss: 0.1143 - classification_loss: 0.0150 70/500 [===>..........................] - ETA: 3:39 - loss: 0.1284 - regression_loss: 0.1135 - classification_loss: 0.0149 71/500 [===>..........................] - ETA: 3:38 - loss: 0.1284 - regression_loss: 0.1136 - classification_loss: 0.0148 72/500 [===>..........................] - ETA: 3:38 - loss: 0.1281 - regression_loss: 0.1135 - classification_loss: 0.0147 73/500 [===>..........................] 
- ETA: 3:38 - loss: 0.1283 - regression_loss: 0.1138 - classification_loss: 0.0146 74/500 [===>..........................] - ETA: 3:37 - loss: 0.1274 - regression_loss: 0.1130 - classification_loss: 0.0145 75/500 [===>..........................] - ETA: 3:37 - loss: 0.1267 - regression_loss: 0.1122 - classification_loss: 0.0145 76/500 [===>..........................] - ETA: 3:36 - loss: 0.1283 - regression_loss: 0.1136 - classification_loss: 0.0147 77/500 [===>..........................] - ETA: 3:36 - loss: 0.1271 - regression_loss: 0.1126 - classification_loss: 0.0146 78/500 [===>..........................] - ETA: 3:35 - loss: 0.1273 - regression_loss: 0.1127 - classification_loss: 0.0146 79/500 [===>..........................] - ETA: 3:35 - loss: 0.1270 - regression_loss: 0.1125 - classification_loss: 0.0145 80/500 [===>..........................] - ETA: 3:34 - loss: 0.1261 - regression_loss: 0.1117 - classification_loss: 0.0144 81/500 [===>..........................] - ETA: 3:34 - loss: 0.1259 - regression_loss: 0.1115 - classification_loss: 0.0144 82/500 [===>..........................] - ETA: 3:33 - loss: 0.1251 - regression_loss: 0.1108 - classification_loss: 0.0143 83/500 [===>..........................] - ETA: 3:32 - loss: 0.1259 - regression_loss: 0.1115 - classification_loss: 0.0143 84/500 [====>.........................] - ETA: 3:32 - loss: 0.1250 - regression_loss: 0.1107 - classification_loss: 0.0142 85/500 [====>.........................] - ETA: 3:31 - loss: 0.1255 - regression_loss: 0.1113 - classification_loss: 0.0142 86/500 [====>.........................] - ETA: 3:31 - loss: 0.1250 - regression_loss: 0.1108 - classification_loss: 0.0142 87/500 [====>.........................] - ETA: 3:30 - loss: 0.1244 - regression_loss: 0.1102 - classification_loss: 0.0142 88/500 [====>.........................] - ETA: 3:30 - loss: 0.1234 - regression_loss: 0.1093 - classification_loss: 0.0141 89/500 [====>.........................] 
- ETA: 3:30 - loss: 0.1228 - regression_loss: 0.1088 - classification_loss: 0.0140 90/500 [====>.........................] - ETA: 3:29 - loss: 0.1226 - regression_loss: 0.1086 - classification_loss: 0.0140 91/500 [====>.........................] - ETA: 3:29 - loss: 0.1227 - regression_loss: 0.1087 - classification_loss: 0.0140 92/500 [====>.........................] - ETA: 3:28 - loss: 0.1231 - regression_loss: 0.1091 - classification_loss: 0.0140 93/500 [====>.........................] - ETA: 3:28 - loss: 0.1223 - regression_loss: 0.1084 - classification_loss: 0.0139 94/500 [====>.........................] - ETA: 3:27 - loss: 0.1220 - regression_loss: 0.1082 - classification_loss: 0.0139 95/500 [====>.........................] - ETA: 3:27 - loss: 0.1218 - regression_loss: 0.1080 - classification_loss: 0.0138 96/500 [====>.........................] - ETA: 3:26 - loss: 0.1214 - regression_loss: 0.1077 - classification_loss: 0.0137 97/500 [====>.........................] - ETA: 3:25 - loss: 0.1209 - regression_loss: 0.1073 - classification_loss: 0.0136 98/500 [====>.........................] - ETA: 3:25 - loss: 0.1202 - regression_loss: 0.1066 - classification_loss: 0.0136 99/500 [====>.........................] - ETA: 3:24 - loss: 0.1196 - regression_loss: 0.1060 - classification_loss: 0.0135 100/500 [=====>........................] - ETA: 3:24 - loss: 0.1195 - regression_loss: 0.1059 - classification_loss: 0.0135 101/500 [=====>........................] - ETA: 3:23 - loss: 0.1189 - regression_loss: 0.1054 - classification_loss: 0.0135 102/500 [=====>........................] - ETA: 3:23 - loss: 0.1193 - regression_loss: 0.1058 - classification_loss: 0.0135 103/500 [=====>........................] - ETA: 3:22 - loss: 0.1187 - regression_loss: 0.1053 - classification_loss: 0.0134 104/500 [=====>........................] - ETA: 3:22 - loss: 0.1194 - regression_loss: 0.1060 - classification_loss: 0.0134 105/500 [=====>........................] 
- ETA: 3:21 - loss: 0.1185 - regression_loss: 0.1052 - classification_loss: 0.0133 106/500 [=====>........................] - ETA: 3:21 - loss: 0.1191 - regression_loss: 0.1057 - classification_loss: 0.0134 107/500 [=====>........................] - ETA: 3:20 - loss: 0.1183 - regression_loss: 0.1050 - classification_loss: 0.0133 108/500 [=====>........................] - ETA: 3:20 - loss: 0.1180 - regression_loss: 0.1047 - classification_loss: 0.0133 109/500 [=====>........................] - ETA: 3:19 - loss: 0.1192 - regression_loss: 0.1059 - classification_loss: 0.0133 110/500 [=====>........................] - ETA: 3:18 - loss: 0.1189 - regression_loss: 0.1056 - classification_loss: 0.0133 111/500 [=====>........................] - ETA: 3:18 - loss: 0.1183 - regression_loss: 0.1050 - classification_loss: 0.0133 112/500 [=====>........................] - ETA: 3:17 - loss: 0.1182 - regression_loss: 0.1050 - classification_loss: 0.0132 113/500 [=====>........................] - ETA: 3:17 - loss: 0.1181 - regression_loss: 0.1049 - classification_loss: 0.0132 114/500 [=====>........................] - ETA: 3:16 - loss: 0.1179 - regression_loss: 0.1048 - classification_loss: 0.0132 115/500 [=====>........................] - ETA: 3:16 - loss: 0.1174 - regression_loss: 0.1043 - classification_loss: 0.0131 116/500 [=====>........................] - ETA: 3:15 - loss: 0.1173 - regression_loss: 0.1042 - classification_loss: 0.0131 117/500 [======>.......................] - ETA: 3:15 - loss: 0.1167 - regression_loss: 0.1037 - classification_loss: 0.0130 118/500 [======>.......................] - ETA: 3:14 - loss: 0.1162 - regression_loss: 0.1032 - classification_loss: 0.0129 119/500 [======>.......................] - ETA: 3:14 - loss: 0.1179 - regression_loss: 0.1048 - classification_loss: 0.0131 120/500 [======>.......................] - ETA: 3:13 - loss: 0.1179 - regression_loss: 0.1047 - classification_loss: 0.0132 121/500 [======>.......................] 
- ETA: 3:13 - loss: 0.1174 - regression_loss: 0.1042 - classification_loss: 0.0132 122/500 [======>.......................] - ETA: 3:12 - loss: 0.1180 - regression_loss: 0.1048 - classification_loss: 0.0132 123/500 [======>.......................] - ETA: 3:12 - loss: 0.1177 - regression_loss: 0.1045 - classification_loss: 0.0132 124/500 [======>.......................] - ETA: 3:12 - loss: 0.1171 - regression_loss: 0.1040 - classification_loss: 0.0131 125/500 [======>.......................] - ETA: 3:11 - loss: 0.1169 - regression_loss: 0.1039 - classification_loss: 0.0131 126/500 [======>.......................] - ETA: 3:10 - loss: 0.1166 - regression_loss: 0.1036 - classification_loss: 0.0130 127/500 [======>.......................] - ETA: 3:10 - loss: 0.1162 - regression_loss: 0.1033 - classification_loss: 0.0130 128/500 [======>.......................] - ETA: 3:09 - loss: 0.1172 - regression_loss: 0.1041 - classification_loss: 0.0131 129/500 [======>.......................] - ETA: 3:09 - loss: 0.1164 - regression_loss: 0.1034 - classification_loss: 0.0130 130/500 [======>.......................] - ETA: 3:08 - loss: 0.1170 - regression_loss: 0.1040 - classification_loss: 0.0131 131/500 [======>.......................] - ETA: 3:08 - loss: 0.1163 - regression_loss: 0.1033 - classification_loss: 0.0130 132/500 [======>.......................] - ETA: 3:07 - loss: 0.1164 - regression_loss: 0.1035 - classification_loss: 0.0130 133/500 [======>.......................] - ETA: 3:07 - loss: 0.1163 - regression_loss: 0.1034 - classification_loss: 0.0129 134/500 [=======>......................] - ETA: 3:06 - loss: 0.1161 - regression_loss: 0.1032 - classification_loss: 0.0129 135/500 [=======>......................] - ETA: 3:06 - loss: 0.1156 - regression_loss: 0.1028 - classification_loss: 0.0129 136/500 [=======>......................] - ETA: 3:05 - loss: 0.1161 - regression_loss: 0.1032 - classification_loss: 0.0129 137/500 [=======>......................] 
- ETA: 3:05 - loss: 0.1157 - regression_loss: 0.1029 - classification_loss: 0.0128 138/500 [=======>......................] - ETA: 3:05 - loss: 0.1157 - regression_loss: 0.1029 - classification_loss: 0.0128 139/500 [=======>......................] - ETA: 3:04 - loss: 0.1167 - regression_loss: 0.1036 - classification_loss: 0.0131 140/500 [=======>......................] - ETA: 3:03 - loss: 0.1165 - regression_loss: 0.1034 - classification_loss: 0.0130 141/500 [=======>......................] - ETA: 3:03 - loss: 0.1164 - regression_loss: 0.1033 - classification_loss: 0.0131 142/500 [=======>......................] - ETA: 3:02 - loss: 0.1205 - regression_loss: 0.1070 - classification_loss: 0.0135 143/500 [=======>......................] - ETA: 3:02 - loss: 0.1202 - regression_loss: 0.1068 - classification_loss: 0.0134 144/500 [=======>......................] - ETA: 3:01 - loss: 0.1197 - regression_loss: 0.1063 - classification_loss: 0.0134 145/500 [=======>......................] - ETA: 3:01 - loss: 0.1200 - regression_loss: 0.1065 - classification_loss: 0.0135 146/500 [=======>......................] - ETA: 3:00 - loss: 0.1198 - regression_loss: 0.1063 - classification_loss: 0.0135 147/500 [=======>......................] - ETA: 3:00 - loss: 0.1196 - regression_loss: 0.1061 - classification_loss: 0.0134 148/500 [=======>......................] - ETA: 2:59 - loss: 0.1190 - regression_loss: 0.1056 - classification_loss: 0.0134 149/500 [=======>......................] - ETA: 2:59 - loss: 0.1185 - regression_loss: 0.1052 - classification_loss: 0.0133 150/500 [========>.....................] - ETA: 2:58 - loss: 0.1181 - regression_loss: 0.1048 - classification_loss: 0.0133 151/500 [========>.....................] - ETA: 2:58 - loss: 0.1191 - regression_loss: 0.1057 - classification_loss: 0.0134 152/500 [========>.....................] - ETA: 2:57 - loss: 0.1193 - regression_loss: 0.1059 - classification_loss: 0.0134 153/500 [========>.....................] 
- ETA: 2:57 - loss: 0.1189 - regression_loss: 0.1055 - classification_loss: 0.0134 154/500 [========>.....................] - ETA: 2:56 - loss: 0.1191 - regression_loss: 0.1057 - classification_loss: 0.0134 155/500 [========>.....................] - ETA: 2:56 - loss: 0.1189 - regression_loss: 0.1055 - classification_loss: 0.0134 156/500 [========>.....................] - ETA: 2:55 - loss: 0.1191 - regression_loss: 0.1057 - classification_loss: 0.0134 157/500 [========>.....................] - ETA: 2:55 - loss: 0.1186 - regression_loss: 0.1052 - classification_loss: 0.0134 158/500 [========>.....................] - ETA: 2:54 - loss: 0.1183 - regression_loss: 0.1050 - classification_loss: 0.0133 159/500 [========>.....................] - ETA: 2:54 - loss: 0.1187 - regression_loss: 0.1053 - classification_loss: 0.0134 160/500 [========>.....................] - ETA: 2:53 - loss: 0.1189 - regression_loss: 0.1055 - classification_loss: 0.0134 161/500 [========>.....................] - ETA: 2:53 - loss: 0.1187 - regression_loss: 0.1054 - classification_loss: 0.0134 162/500 [========>.....................] - ETA: 2:52 - loss: 0.1191 - regression_loss: 0.1057 - classification_loss: 0.0134 163/500 [========>.....................] - ETA: 2:52 - loss: 0.1190 - regression_loss: 0.1057 - classification_loss: 0.0134 164/500 [========>.....................] - ETA: 2:51 - loss: 0.1187 - regression_loss: 0.1053 - classification_loss: 0.0133 165/500 [========>.....................] - ETA: 2:51 - loss: 0.1187 - regression_loss: 0.1055 - classification_loss: 0.0133 166/500 [========>.....................] - ETA: 2:50 - loss: 0.1230 - regression_loss: 0.1084 - classification_loss: 0.0147 167/500 [=========>....................] - ETA: 2:50 - loss: 0.1225 - regression_loss: 0.1079 - classification_loss: 0.0146 168/500 [=========>....................] - ETA: 2:49 - loss: 0.1241 - regression_loss: 0.1091 - classification_loss: 0.0151 169/500 [=========>....................] 
- ETA: 2:49 - loss: 0.1244 - regression_loss: 0.1094 - classification_loss: 0.0150 170/500 [=========>....................] - ETA: 2:48 - loss: 0.1240 - regression_loss: 0.1090 - classification_loss: 0.0150 171/500 [=========>....................] - ETA: 2:48 - loss: 0.1237 - regression_loss: 0.1087 - classification_loss: 0.0150 172/500 [=========>....................] - ETA: 2:47 - loss: 0.1232 - regression_loss: 0.1083 - classification_loss: 0.0149 173/500 [=========>....................] - ETA: 2:47 - loss: 0.1229 - regression_loss: 0.1081 - classification_loss: 0.0149 174/500 [=========>....................] - ETA: 2:46 - loss: 0.1225 - regression_loss: 0.1077 - classification_loss: 0.0148 175/500 [=========>....................] - ETA: 2:46 - loss: 0.1223 - regression_loss: 0.1075 - classification_loss: 0.0148 176/500 [=========>....................] - ETA: 2:45 - loss: 0.1230 - regression_loss: 0.1082 - classification_loss: 0.0148 177/500 [=========>....................] - ETA: 2:45 - loss: 0.1230 - regression_loss: 0.1081 - classification_loss: 0.0148 178/500 [=========>....................] - ETA: 2:44 - loss: 0.1226 - regression_loss: 0.1078 - classification_loss: 0.0148 179/500 [=========>....................] - ETA: 2:44 - loss: 0.1227 - regression_loss: 0.1080 - classification_loss: 0.0147 180/500 [=========>....................] - ETA: 2:43 - loss: 0.1230 - regression_loss: 0.1082 - classification_loss: 0.0147 181/500 [=========>....................] - ETA: 2:43 - loss: 0.1227 - regression_loss: 0.1080 - classification_loss: 0.0147 182/500 [=========>....................] - ETA: 2:42 - loss: 0.1225 - regression_loss: 0.1078 - classification_loss: 0.0147 183/500 [=========>....................] - ETA: 2:42 - loss: 0.1221 - regression_loss: 0.1075 - classification_loss: 0.0147 184/500 [==========>...................] - ETA: 2:41 - loss: 0.1216 - regression_loss: 0.1070 - classification_loss: 0.0146 185/500 [==========>...................] 
- ETA: 2:41 - loss: 0.1211 - regression_loss: 0.1066 - classification_loss: 0.0145 186/500 [==========>...................] - ETA: 2:40 - loss: 0.1207 - regression_loss: 0.1062 - classification_loss: 0.0145 187/500 [==========>...................] - ETA: 2:40 - loss: 0.1206 - regression_loss: 0.1061 - classification_loss: 0.0145 188/500 [==========>...................] - ETA: 2:39 - loss: 0.1203 - regression_loss: 0.1058 - classification_loss: 0.0145 189/500 [==========>...................] - ETA: 2:39 - loss: 0.1224 - regression_loss: 0.1079 - classification_loss: 0.0146 190/500 [==========>...................] - ETA: 2:38 - loss: 0.1223 - regression_loss: 0.1077 - classification_loss: 0.0146 191/500 [==========>...................] - ETA: 2:38 - loss: 0.1223 - regression_loss: 0.1077 - classification_loss: 0.0146 192/500 [==========>...................] - ETA: 2:37 - loss: 0.1221 - regression_loss: 0.1076 - classification_loss: 0.0145 193/500 [==========>...................] - ETA: 2:37 - loss: 0.1220 - regression_loss: 0.1074 - classification_loss: 0.0145 194/500 [==========>...................] - ETA: 2:36 - loss: 0.1214 - regression_loss: 0.1070 - classification_loss: 0.0145 195/500 [==========>...................] - ETA: 2:36 - loss: 0.1213 - regression_loss: 0.1069 - classification_loss: 0.0144 196/500 [==========>...................] - ETA: 2:35 - loss: 0.1229 - regression_loss: 0.1084 - classification_loss: 0.0145 197/500 [==========>...................] - ETA: 2:35 - loss: 0.1230 - regression_loss: 0.1085 - classification_loss: 0.0145 198/500 [==========>...................] - ETA: 2:34 - loss: 0.1228 - regression_loss: 0.1084 - classification_loss: 0.0144 199/500 [==========>...................] - ETA: 2:34 - loss: 0.1225 - regression_loss: 0.1081 - classification_loss: 0.0144 200/500 [===========>..................] - ETA: 2:33 - loss: 0.1225 - regression_loss: 0.1081 - classification_loss: 0.0144 201/500 [===========>..................] 
- ETA: 2:33 - loss: 0.1226 - regression_loss: 0.1082 - classification_loss: 0.0144 202/500 [===========>..................] - ETA: 2:32 - loss: 0.1221 - regression_loss: 0.1078 - classification_loss: 0.0144 203/500 [===========>..................] - ETA: 2:32 - loss: 0.1220 - regression_loss: 0.1076 - classification_loss: 0.0144 204/500 [===========>..................] - ETA: 2:31 - loss: 0.1219 - regression_loss: 0.1075 - classification_loss: 0.0143 205/500 [===========>..................] - ETA: 2:30 - loss: 0.1237 - regression_loss: 0.1089 - classification_loss: 0.0148 206/500 [===========>..................] - ETA: 2:30 - loss: 0.1233 - regression_loss: 0.1086 - classification_loss: 0.0147 207/500 [===========>..................] - ETA: 2:29 - loss: 0.1230 - regression_loss: 0.1083 - classification_loss: 0.0147 208/500 [===========>..................] - ETA: 2:29 - loss: 0.1226 - regression_loss: 0.1080 - classification_loss: 0.0146 209/500 [===========>..................] - ETA: 2:28 - loss: 0.1247 - regression_loss: 0.1098 - classification_loss: 0.0148 210/500 [===========>..................] - ETA: 2:28 - loss: 0.1244 - regression_loss: 0.1096 - classification_loss: 0.0148 211/500 [===========>..................] - ETA: 2:27 - loss: 0.1243 - regression_loss: 0.1095 - classification_loss: 0.0147 212/500 [===========>..................] - ETA: 2:27 - loss: 0.1245 - regression_loss: 0.1097 - classification_loss: 0.0147 213/500 [===========>..................] - ETA: 2:26 - loss: 0.1242 - regression_loss: 0.1095 - classification_loss: 0.0147 214/500 [===========>..................] - ETA: 2:26 - loss: 0.1240 - regression_loss: 0.1093 - classification_loss: 0.0147 215/500 [===========>..................] - ETA: 2:25 - loss: 0.1239 - regression_loss: 0.1092 - classification_loss: 0.0147 216/500 [===========>..................] - ETA: 2:25 - loss: 0.1236 - regression_loss: 0.1090 - classification_loss: 0.0146 217/500 [============>.................] 
- ETA: 2:24 - loss: 0.1246 - regression_loss: 0.1099 - classification_loss: 0.0146 218/500 [============>.................] - ETA: 2:24 - loss: 0.1244 - regression_loss: 0.1098 - classification_loss: 0.0146 219/500 [============>.................] - ETA: 2:23 - loss: 0.1242 - regression_loss: 0.1096 - classification_loss: 0.0146 220/500 [============>.................] - ETA: 2:23 - loss: 0.1239 - regression_loss: 0.1094 - classification_loss: 0.0145 221/500 [============>.................] - ETA: 2:22 - loss: 0.1241 - regression_loss: 0.1095 - classification_loss: 0.0146 222/500 [============>.................] - ETA: 2:22 - loss: 0.1239 - regression_loss: 0.1093 - classification_loss: 0.0145 223/500 [============>.................] - ETA: 2:21 - loss: 0.1237 - regression_loss: 0.1091 - classification_loss: 0.0145 224/500 [============>.................] - ETA: 2:21 - loss: 0.1233 - regression_loss: 0.1088 - classification_loss: 0.0145 225/500 [============>.................] - ETA: 2:20 - loss: 0.1231 - regression_loss: 0.1087 - classification_loss: 0.0144 226/500 [============>.................] - ETA: 2:20 - loss: 0.1228 - regression_loss: 0.1084 - classification_loss: 0.0144 227/500 [============>.................] - ETA: 2:19 - loss: 0.1225 - regression_loss: 0.1081 - classification_loss: 0.0144 228/500 [============>.................] - ETA: 2:19 - loss: 0.1226 - regression_loss: 0.1083 - classification_loss: 0.0144 229/500 [============>.................] - ETA: 2:18 - loss: 0.1223 - regression_loss: 0.1079 - classification_loss: 0.0143 230/500 [============>.................] - ETA: 2:18 - loss: 0.1224 - regression_loss: 0.1080 - classification_loss: 0.0143 231/500 [============>.................] - ETA: 2:17 - loss: 0.1222 - regression_loss: 0.1079 - classification_loss: 0.0143 232/500 [============>.................] - ETA: 2:17 - loss: 0.1220 - regression_loss: 0.1077 - classification_loss: 0.0143 233/500 [============>.................] 
- ETA: 2:16 - loss: 0.1220 - regression_loss: 0.1077 - classification_loss: 0.0143 234/500 [=============>................] - ETA: 2:16 - loss: 0.1227 - regression_loss: 0.1084 - classification_loss: 0.0143 235/500 [=============>................] - ETA: 2:15 - loss: 0.1224 - regression_loss: 0.1082 - classification_loss: 0.0143 236/500 [=============>................] - ETA: 2:15 - loss: 0.1222 - regression_loss: 0.1080 - classification_loss: 0.0142 237/500 [=============>................] - ETA: 2:14 - loss: 0.1220 - regression_loss: 0.1077 - classification_loss: 0.0142 238/500 [=============>................] - ETA: 2:14 - loss: 0.1216 - regression_loss: 0.1074 - classification_loss: 0.0142 239/500 [=============>................] - ETA: 2:13 - loss: 0.1213 - regression_loss: 0.1072 - classification_loss: 0.0142 240/500 [=============>................] - ETA: 2:13 - loss: 0.1213 - regression_loss: 0.1071 - classification_loss: 0.0142 241/500 [=============>................] - ETA: 2:12 - loss: 0.1213 - regression_loss: 0.1072 - classification_loss: 0.0141 242/500 [=============>................] - ETA: 2:11 - loss: 0.1210 - regression_loss: 0.1069 - classification_loss: 0.0141 243/500 [=============>................] - ETA: 2:11 - loss: 0.1212 - regression_loss: 0.1071 - classification_loss: 0.0141 244/500 [=============>................] - ETA: 2:10 - loss: 0.1208 - regression_loss: 0.1067 - classification_loss: 0.0141 245/500 [=============>................] - ETA: 2:10 - loss: 0.1205 - regression_loss: 0.1065 - classification_loss: 0.0140 246/500 [=============>................] - ETA: 2:09 - loss: 0.1202 - regression_loss: 0.1062 - classification_loss: 0.0140 247/500 [=============>................] - ETA: 2:09 - loss: 0.1199 - regression_loss: 0.1059 - classification_loss: 0.0140 248/500 [=============>................] - ETA: 2:08 - loss: 0.1197 - regression_loss: 0.1057 - classification_loss: 0.0139 249/500 [=============>................] 
- ETA: 2:08 - loss: 0.1198 - regression_loss: 0.1059 - classification_loss: 0.0140 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1197 - regression_loss: 0.1057 - classification_loss: 0.0140 251/500 [==============>...............] - ETA: 2:07 - loss: 0.1194 - regression_loss: 0.1055 - classification_loss: 0.0139 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1194 - regression_loss: 0.1055 - classification_loss: 0.0139 253/500 [==============>...............] - ETA: 2:06 - loss: 0.1192 - regression_loss: 0.1053 - classification_loss: 0.0139 254/500 [==============>...............] - ETA: 2:05 - loss: 0.1190 - regression_loss: 0.1051 - classification_loss: 0.0139 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1189 - regression_loss: 0.1051 - classification_loss: 0.0139 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1190 - regression_loss: 0.1051 - classification_loss: 0.0139 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1187 - regression_loss: 0.1049 - classification_loss: 0.0138 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1184 - regression_loss: 0.1046 - classification_loss: 0.0138 259/500 [==============>...............] - ETA: 2:03 - loss: 0.1189 - regression_loss: 0.1050 - classification_loss: 0.0139 260/500 [==============>...............] - ETA: 2:02 - loss: 0.1186 - regression_loss: 0.1047 - classification_loss: 0.0139 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1183 - regression_loss: 0.1045 - classification_loss: 0.0138 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1181 - regression_loss: 0.1043 - classification_loss: 0.0138 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1180 - regression_loss: 0.1042 - classification_loss: 0.0138 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1178 - regression_loss: 0.1040 - classification_loss: 0.0138 265/500 [==============>...............] 
- ETA: 2:00 - loss: 0.1176 - regression_loss: 0.1038 - classification_loss: 0.0138 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1174 - regression_loss: 0.1037 - classification_loss: 0.0137 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1174 - regression_loss: 0.1036 - classification_loss: 0.0137 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1173 - regression_loss: 0.1036 - classification_loss: 0.0137 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1171 - regression_loss: 0.1034 - classification_loss: 0.0137 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1169 - regression_loss: 0.1032 - classification_loss: 0.0137 271/500 [===============>..............] - ETA: 1:57 - loss: 0.1172 - regression_loss: 0.1035 - classification_loss: 0.0137 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1171 - regression_loss: 0.1034 - classification_loss: 0.0137 273/500 [===============>..............] - ETA: 1:56 - loss: 0.1171 - regression_loss: 0.1035 - classification_loss: 0.0137 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1169 - regression_loss: 0.1033 - classification_loss: 0.0136 275/500 [===============>..............] - ETA: 1:55 - loss: 0.1168 - regression_loss: 0.1031 - classification_loss: 0.0136 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1166 - regression_loss: 0.1029 - classification_loss: 0.0136 277/500 [===============>..............] - ETA: 1:53 - loss: 0.1164 - regression_loss: 0.1028 - classification_loss: 0.0136 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1180 - regression_loss: 0.1041 - classification_loss: 0.0140 279/500 [===============>..............] - ETA: 1:52 - loss: 0.1179 - regression_loss: 0.1040 - classification_loss: 0.0140 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1179 - regression_loss: 0.1039 - classification_loss: 0.0140 281/500 [===============>..............] 
- ETA: 1:51 - loss: 0.1179 - regression_loss: 0.1039 - classification_loss: 0.0139 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1179 - regression_loss: 0.1040 - classification_loss: 0.0139 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1179 - regression_loss: 0.1040 - classification_loss: 0.0139 284/500 [================>.............] - ETA: 1:50 - loss: 0.1178 - regression_loss: 0.1039 - classification_loss: 0.0139 285/500 [================>.............] - ETA: 1:49 - loss: 0.1175 - regression_loss: 0.1036 - classification_loss: 0.0139 286/500 [================>.............] - ETA: 1:49 - loss: 0.1174 - regression_loss: 0.1036 - classification_loss: 0.0139 287/500 [================>.............] - ETA: 1:48 - loss: 0.1173 - regression_loss: 0.1035 - classification_loss: 0.0138 288/500 [================>.............] - ETA: 1:48 - loss: 0.1171 - regression_loss: 0.1032 - classification_loss: 0.0138 289/500 [================>.............] - ETA: 1:47 - loss: 0.1171 - regression_loss: 0.1033 - classification_loss: 0.0138 290/500 [================>.............] - ETA: 1:47 - loss: 0.1169 - regression_loss: 0.1031 - classification_loss: 0.0138 291/500 [================>.............] - ETA: 1:46 - loss: 0.1168 - regression_loss: 0.1030 - classification_loss: 0.0138 292/500 [================>.............] - ETA: 1:46 - loss: 0.1174 - regression_loss: 0.1035 - classification_loss: 0.0139 293/500 [================>.............] - ETA: 1:45 - loss: 0.1174 - regression_loss: 0.1035 - classification_loss: 0.0138 294/500 [================>.............] - ETA: 1:45 - loss: 0.1172 - regression_loss: 0.1034 - classification_loss: 0.0138 295/500 [================>.............] - ETA: 1:44 - loss: 0.1179 - regression_loss: 0.1041 - classification_loss: 0.0138 296/500 [================>.............] - ETA: 1:44 - loss: 0.1176 - regression_loss: 0.1038 - classification_loss: 0.0138 297/500 [================>.............] 
- ETA: 1:43 - loss: 0.1177 - regression_loss: 0.1039 - classification_loss: 0.0138 298/500 [================>.............] - ETA: 1:43 - loss: 0.1175 - regression_loss: 0.1037 - classification_loss: 0.0138 299/500 [================>.............] - ETA: 1:42 - loss: 0.1174 - regression_loss: 0.1036 - classification_loss: 0.0138 300/500 [=================>............] - ETA: 1:42 - loss: 0.1174 - regression_loss: 0.1036 - classification_loss: 0.0138 301/500 [=================>............] - ETA: 1:41 - loss: 0.1172 - regression_loss: 0.1034 - classification_loss: 0.0138 302/500 [=================>............] - ETA: 1:41 - loss: 0.1171 - regression_loss: 0.1033 - classification_loss: 0.0138 303/500 [=================>............] - ETA: 1:40 - loss: 0.1169 - regression_loss: 0.1032 - classification_loss: 0.0138 304/500 [=================>............] - ETA: 1:40 - loss: 0.1178 - regression_loss: 0.1038 - classification_loss: 0.0140 305/500 [=================>............] - ETA: 1:39 - loss: 0.1178 - regression_loss: 0.1037 - classification_loss: 0.0140 306/500 [=================>............] - ETA: 1:39 - loss: 0.1175 - regression_loss: 0.1035 - classification_loss: 0.0140 307/500 [=================>............] - ETA: 1:38 - loss: 0.1179 - regression_loss: 0.1039 - classification_loss: 0.0140 308/500 [=================>............] - ETA: 1:38 - loss: 0.1176 - regression_loss: 0.1037 - classification_loss: 0.0140 309/500 [=================>............] - ETA: 1:37 - loss: 0.1175 - regression_loss: 0.1035 - classification_loss: 0.0140 310/500 [=================>............] - ETA: 1:37 - loss: 0.1173 - regression_loss: 0.1034 - classification_loss: 0.0139 311/500 [=================>............] - ETA: 1:36 - loss: 0.1171 - regression_loss: 0.1032 - classification_loss: 0.0139 312/500 [=================>............] - ETA: 1:36 - loss: 0.1168 - regression_loss: 0.1029 - classification_loss: 0.0139 313/500 [=================>............] 
- ETA: 1:35 - loss: 0.1167 - regression_loss: 0.1028 - classification_loss: 0.0139 314/500 [=================>............] - ETA: 1:34 - loss: 0.1168 - regression_loss: 0.1029 - classification_loss: 0.0139 315/500 [=================>............] - ETA: 1:34 - loss: 0.1166 - regression_loss: 0.1028 - classification_loss: 0.0139 316/500 [=================>............] - ETA: 1:33 - loss: 0.1165 - regression_loss: 0.1026 - classification_loss: 0.0138 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1165 - regression_loss: 0.1027 - classification_loss: 0.0139 318/500 [==================>...........] - ETA: 1:32 - loss: 0.1167 - regression_loss: 0.1028 - classification_loss: 0.0138 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1165 - regression_loss: 0.1027 - classification_loss: 0.0138 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1163 - regression_loss: 0.1025 - classification_loss: 0.0138 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1163 - regression_loss: 0.1025 - classification_loss: 0.0138 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1160 - regression_loss: 0.1023 - classification_loss: 0.0138 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1162 - regression_loss: 0.1024 - classification_loss: 0.0138 324/500 [==================>...........] - ETA: 1:29 - loss: 0.1160 - regression_loss: 0.1023 - classification_loss: 0.0137 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1159 - regression_loss: 0.1021 - classification_loss: 0.0137 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1158 - regression_loss: 0.1021 - classification_loss: 0.0137 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1158 - regression_loss: 0.1021 - classification_loss: 0.0137 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1155 - regression_loss: 0.1019 - classification_loss: 0.0137 329/500 [==================>...........] 
- ETA: 1:27 - loss: 0.1157 - regression_loss: 0.1020 - classification_loss: 0.0137 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1162 - regression_loss: 0.1023 - classification_loss: 0.0138 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1160 - regression_loss: 0.1022 - classification_loss: 0.0138 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1159 - regression_loss: 0.1021 - classification_loss: 0.0138 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1158 - regression_loss: 0.1020 - classification_loss: 0.0138 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1157 - regression_loss: 0.1019 - classification_loss: 0.0138 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1160 - regression_loss: 0.1022 - classification_loss: 0.0138 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1158 - regression_loss: 0.1020 - classification_loss: 0.0138 337/500 [===================>..........] - ETA: 1:23 - loss: 0.1170 - regression_loss: 0.1031 - classification_loss: 0.0139 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1169 - regression_loss: 0.1030 - classification_loss: 0.0138 339/500 [===================>..........] - ETA: 1:22 - loss: 0.1166 - regression_loss: 0.1028 - classification_loss: 0.0138 340/500 [===================>..........] - ETA: 1:21 - loss: 0.1168 - regression_loss: 0.1030 - classification_loss: 0.0138 341/500 [===================>..........] - ETA: 1:21 - loss: 0.1168 - regression_loss: 0.1030 - classification_loss: 0.0138 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1166 - regression_loss: 0.1028 - classification_loss: 0.0138 343/500 [===================>..........] - ETA: 1:20 - loss: 0.1165 - regression_loss: 0.1027 - classification_loss: 0.0138 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1163 - regression_loss: 0.1025 - classification_loss: 0.0138 345/500 [===================>..........] 
- ETA: 1:19 - loss: 0.1161 - regression_loss: 0.1023 - classification_loss: 0.0137 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1166 - regression_loss: 0.1028 - classification_loss: 0.0138 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1164 - regression_loss: 0.1026 - classification_loss: 0.0138 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1164 - regression_loss: 0.1026 - classification_loss: 0.0137 349/500 [===================>..........] - ETA: 1:17 - loss: 0.1161 - regression_loss: 0.1024 - classification_loss: 0.0137 350/500 [====================>.........] - ETA: 1:16 - loss: 0.1160 - regression_loss: 0.1023 - classification_loss: 0.0137 351/500 [====================>.........] - ETA: 1:16 - loss: 0.1159 - regression_loss: 0.1022 - classification_loss: 0.0137 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1164 - regression_loss: 0.1027 - classification_loss: 0.0137 353/500 [====================>.........] - ETA: 1:15 - loss: 0.1162 - regression_loss: 0.1025 - classification_loss: 0.0137 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1160 - regression_loss: 0.1023 - classification_loss: 0.0137 355/500 [====================>.........] - ETA: 1:14 - loss: 0.1159 - regression_loss: 0.1022 - classification_loss: 0.0137 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1157 - regression_loss: 0.1021 - classification_loss: 0.0136 357/500 [====================>.........] - ETA: 1:12 - loss: 0.1158 - regression_loss: 0.1022 - classification_loss: 0.0137 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1157 - regression_loss: 0.1021 - classification_loss: 0.0136 359/500 [====================>.........] - ETA: 1:11 - loss: 0.1158 - regression_loss: 0.1022 - classification_loss: 0.0136 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1166 - regression_loss: 0.1030 - classification_loss: 0.0136 361/500 [====================>.........] 
- ETA: 1:10 - loss: 0.1165 - regression_loss: 0.1029 - classification_loss: 0.0136 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1171 - regression_loss: 0.1034 - classification_loss: 0.0136 363/500 [====================>.........] - ETA: 1:09 - loss: 0.1170 - regression_loss: 0.1034 - classification_loss: 0.0136 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1170 - regression_loss: 0.1034 - classification_loss: 0.0136 365/500 [====================>.........] - ETA: 1:08 - loss: 0.1170 - regression_loss: 0.1034 - classification_loss: 0.0136 366/500 [====================>.........] - ETA: 1:08 - loss: 0.1168 - regression_loss: 0.1032 - classification_loss: 0.0136 367/500 [=====================>........] - ETA: 1:07 - loss: 0.1169 - regression_loss: 0.1033 - classification_loss: 0.0136 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1167 - regression_loss: 0.1031 - classification_loss: 0.0136 369/500 [=====================>........] - ETA: 1:06 - loss: 0.1165 - regression_loss: 0.1030 - classification_loss: 0.0136 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1163 - regression_loss: 0.1028 - classification_loss: 0.0135 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1163 - regression_loss: 0.1027 - classification_loss: 0.0135 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1161 - regression_loss: 0.1026 - classification_loss: 0.0135 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1163 - regression_loss: 0.1028 - classification_loss: 0.0135 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1162 - regression_loss: 0.1026 - classification_loss: 0.0135 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1160 - regression_loss: 0.1025 - classification_loss: 0.0135 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1159 - regression_loss: 0.1025 - classification_loss: 0.0135 377/500 [=====================>........] 
500/500 [==============================] - 255s 511ms/step - loss: 0.1150 - regression_loss: 0.1015 - classification_loss: 0.0135
326 instances of class plum with average precision: 0.8451
mAP: 0.8451
Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5
Epoch 12/150
- ETA: 2:27 - loss: 0.1140 - regression_loss: 0.1001 - classification_loss: 0.0140 213/500 [===========>..................] - ETA: 2:26 - loss: 0.1137 - regression_loss: 0.0997 - classification_loss: 0.0139 214/500 [===========>..................] - ETA: 2:26 - loss: 0.1134 - regression_loss: 0.0995 - classification_loss: 0.0139 215/500 [===========>..................] - ETA: 2:25 - loss: 0.1144 - regression_loss: 0.1005 - classification_loss: 0.0139 216/500 [===========>..................] - ETA: 2:25 - loss: 0.1144 - regression_loss: 0.1005 - classification_loss: 0.0139 217/500 [============>.................] - ETA: 2:24 - loss: 0.1146 - regression_loss: 0.1007 - classification_loss: 0.0139 218/500 [============>.................] - ETA: 2:24 - loss: 0.1148 - regression_loss: 0.1009 - classification_loss: 0.0139 219/500 [============>.................] - ETA: 2:23 - loss: 0.1145 - regression_loss: 0.1007 - classification_loss: 0.0138 220/500 [============>.................] - ETA: 2:23 - loss: 0.1143 - regression_loss: 0.1005 - classification_loss: 0.0138 221/500 [============>.................] - ETA: 2:22 - loss: 0.1142 - regression_loss: 0.1004 - classification_loss: 0.0138 222/500 [============>.................] - ETA: 2:22 - loss: 0.1138 - regression_loss: 0.1001 - classification_loss: 0.0138 223/500 [============>.................] - ETA: 2:21 - loss: 0.1144 - regression_loss: 0.1006 - classification_loss: 0.0138 224/500 [============>.................] - ETA: 2:21 - loss: 0.1141 - regression_loss: 0.1003 - classification_loss: 0.0138 225/500 [============>.................] - ETA: 2:20 - loss: 0.1140 - regression_loss: 0.1002 - classification_loss: 0.0138 226/500 [============>.................] - ETA: 2:20 - loss: 0.1157 - regression_loss: 0.1015 - classification_loss: 0.0142 227/500 [============>.................] - ETA: 2:19 - loss: 0.1155 - regression_loss: 0.1014 - classification_loss: 0.0141 228/500 [============>.................] 
- ETA: 2:19 - loss: 0.1153 - regression_loss: 0.1012 - classification_loss: 0.0141 229/500 [============>.................] - ETA: 2:18 - loss: 0.1149 - regression_loss: 0.1008 - classification_loss: 0.0141 230/500 [============>.................] - ETA: 2:18 - loss: 0.1147 - regression_loss: 0.1007 - classification_loss: 0.0140 231/500 [============>.................] - ETA: 2:17 - loss: 0.1147 - regression_loss: 0.1007 - classification_loss: 0.0140 232/500 [============>.................] - ETA: 2:17 - loss: 0.1144 - regression_loss: 0.1005 - classification_loss: 0.0140 233/500 [============>.................] - ETA: 2:16 - loss: 0.1153 - regression_loss: 0.1012 - classification_loss: 0.0140 234/500 [=============>................] - ETA: 2:16 - loss: 0.1152 - regression_loss: 0.1012 - classification_loss: 0.0140 235/500 [=============>................] - ETA: 2:15 - loss: 0.1151 - regression_loss: 0.1011 - classification_loss: 0.0140 236/500 [=============>................] - ETA: 2:15 - loss: 0.1149 - regression_loss: 0.1009 - classification_loss: 0.0140 237/500 [=============>................] - ETA: 2:14 - loss: 0.1147 - regression_loss: 0.1008 - classification_loss: 0.0139 238/500 [=============>................] - ETA: 2:14 - loss: 0.1147 - regression_loss: 0.1008 - classification_loss: 0.0139 239/500 [=============>................] - ETA: 2:13 - loss: 0.1150 - regression_loss: 0.1010 - classification_loss: 0.0139 240/500 [=============>................] - ETA: 2:13 - loss: 0.1147 - regression_loss: 0.1008 - classification_loss: 0.0139 241/500 [=============>................] - ETA: 2:12 - loss: 0.1146 - regression_loss: 0.1007 - classification_loss: 0.0139 242/500 [=============>................] - ETA: 2:12 - loss: 0.1145 - regression_loss: 0.1006 - classification_loss: 0.0139 243/500 [=============>................] - ETA: 2:11 - loss: 0.1168 - regression_loss: 0.1028 - classification_loss: 0.0141 244/500 [=============>................] 
- ETA: 2:11 - loss: 0.1192 - regression_loss: 0.1049 - classification_loss: 0.0143 245/500 [=============>................] - ETA: 2:10 - loss: 0.1191 - regression_loss: 0.1048 - classification_loss: 0.0143 246/500 [=============>................] - ETA: 2:10 - loss: 0.1189 - regression_loss: 0.1046 - classification_loss: 0.0143 247/500 [=============>................] - ETA: 2:09 - loss: 0.1189 - regression_loss: 0.1047 - classification_loss: 0.0143 248/500 [=============>................] - ETA: 2:09 - loss: 0.1186 - regression_loss: 0.1044 - classification_loss: 0.0142 249/500 [=============>................] - ETA: 2:08 - loss: 0.1184 - regression_loss: 0.1042 - classification_loss: 0.0142 250/500 [==============>...............] - ETA: 2:08 - loss: 0.1182 - regression_loss: 0.1040 - classification_loss: 0.0142 251/500 [==============>...............] - ETA: 2:07 - loss: 0.1181 - regression_loss: 0.1040 - classification_loss: 0.0141 252/500 [==============>...............] - ETA: 2:07 - loss: 0.1178 - regression_loss: 0.1037 - classification_loss: 0.0141 253/500 [==============>...............] - ETA: 2:06 - loss: 0.1188 - regression_loss: 0.1044 - classification_loss: 0.0144 254/500 [==============>...............] - ETA: 2:06 - loss: 0.1186 - regression_loss: 0.1042 - classification_loss: 0.0144 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1189 - regression_loss: 0.1045 - classification_loss: 0.0144 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1191 - regression_loss: 0.1047 - classification_loss: 0.0144 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1188 - regression_loss: 0.1045 - classification_loss: 0.0144 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1186 - regression_loss: 0.1043 - classification_loss: 0.0144 259/500 [==============>...............] - ETA: 2:03 - loss: 0.1187 - regression_loss: 0.1044 - classification_loss: 0.0144 260/500 [==============>...............] 
- ETA: 2:02 - loss: 0.1184 - regression_loss: 0.1041 - classification_loss: 0.0143 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1184 - regression_loss: 0.1041 - classification_loss: 0.0143 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1181 - regression_loss: 0.1038 - classification_loss: 0.0143 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1178 - regression_loss: 0.1035 - classification_loss: 0.0143 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1175 - regression_loss: 0.1032 - classification_loss: 0.0142 265/500 [==============>...............] - ETA: 2:00 - loss: 0.1173 - regression_loss: 0.1031 - classification_loss: 0.0142 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1171 - regression_loss: 0.1030 - classification_loss: 0.0142 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1173 - regression_loss: 0.1031 - classification_loss: 0.0142 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1171 - regression_loss: 0.1029 - classification_loss: 0.0142 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1168 - regression_loss: 0.1027 - classification_loss: 0.0141 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1170 - regression_loss: 0.1028 - classification_loss: 0.0141 271/500 [===============>..............] - ETA: 1:57 - loss: 0.1171 - regression_loss: 0.1030 - classification_loss: 0.0141 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1167 - regression_loss: 0.1027 - classification_loss: 0.0141 273/500 [===============>..............] - ETA: 1:56 - loss: 0.1166 - regression_loss: 0.1026 - classification_loss: 0.0141 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1163 - regression_loss: 0.1023 - classification_loss: 0.0140 275/500 [===============>..............] - ETA: 1:55 - loss: 0.1161 - regression_loss: 0.1021 - classification_loss: 0.0140 276/500 [===============>..............] 
- ETA: 1:54 - loss: 0.1159 - regression_loss: 0.1019 - classification_loss: 0.0140 277/500 [===============>..............] - ETA: 1:54 - loss: 0.1156 - regression_loss: 0.1017 - classification_loss: 0.0139 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1156 - regression_loss: 0.1017 - classification_loss: 0.0139 279/500 [===============>..............] - ETA: 1:53 - loss: 0.1154 - regression_loss: 0.1015 - classification_loss: 0.0139 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1152 - regression_loss: 0.1013 - classification_loss: 0.0139 281/500 [===============>..............] - ETA: 1:52 - loss: 0.1150 - regression_loss: 0.1012 - classification_loss: 0.0138 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1148 - regression_loss: 0.1009 - classification_loss: 0.0138 283/500 [===============>..............] - ETA: 1:51 - loss: 0.1150 - regression_loss: 0.1012 - classification_loss: 0.0138 284/500 [================>.............] - ETA: 1:50 - loss: 0.1148 - regression_loss: 0.1010 - classification_loss: 0.0138 285/500 [================>.............] - ETA: 1:50 - loss: 0.1147 - regression_loss: 0.1009 - classification_loss: 0.0138 286/500 [================>.............] - ETA: 1:49 - loss: 0.1152 - regression_loss: 0.1014 - classification_loss: 0.0138 287/500 [================>.............] - ETA: 1:49 - loss: 0.1151 - regression_loss: 0.1013 - classification_loss: 0.0138 288/500 [================>.............] - ETA: 1:48 - loss: 0.1152 - regression_loss: 0.1013 - classification_loss: 0.0138 289/500 [================>.............] - ETA: 1:47 - loss: 0.1149 - regression_loss: 0.1011 - classification_loss: 0.0138 290/500 [================>.............] - ETA: 1:47 - loss: 0.1147 - regression_loss: 0.1009 - classification_loss: 0.0138 291/500 [================>.............] - ETA: 1:46 - loss: 0.1145 - regression_loss: 0.1008 - classification_loss: 0.0137 292/500 [================>.............] 
- ETA: 1:46 - loss: 0.1151 - regression_loss: 0.1013 - classification_loss: 0.0138 293/500 [================>.............] - ETA: 1:45 - loss: 0.1149 - regression_loss: 0.1011 - classification_loss: 0.0138 294/500 [================>.............] - ETA: 1:45 - loss: 0.1146 - regression_loss: 0.1009 - classification_loss: 0.0138 295/500 [================>.............] - ETA: 1:44 - loss: 0.1151 - regression_loss: 0.1012 - classification_loss: 0.0139 296/500 [================>.............] - ETA: 1:44 - loss: 0.1149 - regression_loss: 0.1010 - classification_loss: 0.0139 297/500 [================>.............] - ETA: 1:43 - loss: 0.1149 - regression_loss: 0.1010 - classification_loss: 0.0139 298/500 [================>.............] - ETA: 1:43 - loss: 0.1152 - regression_loss: 0.1013 - classification_loss: 0.0139 299/500 [================>.............] - ETA: 1:42 - loss: 0.1151 - regression_loss: 0.1012 - classification_loss: 0.0139 300/500 [=================>............] - ETA: 1:42 - loss: 0.1148 - regression_loss: 0.1009 - classification_loss: 0.0139 301/500 [=================>............] - ETA: 1:41 - loss: 0.1148 - regression_loss: 0.1009 - classification_loss: 0.0139 302/500 [=================>............] - ETA: 1:41 - loss: 0.1149 - regression_loss: 0.1010 - classification_loss: 0.0139 303/500 [=================>............] - ETA: 1:40 - loss: 0.1146 - regression_loss: 0.1007 - classification_loss: 0.0139 304/500 [=================>............] - ETA: 1:40 - loss: 0.1144 - regression_loss: 0.1006 - classification_loss: 0.0138 305/500 [=================>............] - ETA: 1:39 - loss: 0.1141 - regression_loss: 0.1003 - classification_loss: 0.0138 306/500 [=================>............] - ETA: 1:39 - loss: 0.1139 - regression_loss: 0.1001 - classification_loss: 0.0138 307/500 [=================>............] - ETA: 1:38 - loss: 0.1137 - regression_loss: 0.1000 - classification_loss: 0.0138 308/500 [=================>............] 
- ETA: 1:38 - loss: 0.1135 - regression_loss: 0.0998 - classification_loss: 0.0137 309/500 [=================>............] - ETA: 1:37 - loss: 0.1133 - regression_loss: 0.0996 - classification_loss: 0.0137 310/500 [=================>............] - ETA: 1:37 - loss: 0.1133 - regression_loss: 0.0996 - classification_loss: 0.0137 311/500 [=================>............] - ETA: 1:36 - loss: 0.1134 - regression_loss: 0.0997 - classification_loss: 0.0137 312/500 [=================>............] - ETA: 1:36 - loss: 0.1131 - regression_loss: 0.0994 - classification_loss: 0.0137 313/500 [=================>............] - ETA: 1:35 - loss: 0.1131 - regression_loss: 0.0994 - classification_loss: 0.0137 314/500 [=================>............] - ETA: 1:35 - loss: 0.1130 - regression_loss: 0.0993 - classification_loss: 0.0137 315/500 [=================>............] - ETA: 1:34 - loss: 0.1131 - regression_loss: 0.0994 - classification_loss: 0.0137 316/500 [=================>............] - ETA: 1:34 - loss: 0.1133 - regression_loss: 0.0996 - classification_loss: 0.0137 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1147 - regression_loss: 0.1009 - classification_loss: 0.0138 318/500 [==================>...........] - ETA: 1:33 - loss: 0.1145 - regression_loss: 0.1007 - classification_loss: 0.0138 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1148 - regression_loss: 0.1010 - classification_loss: 0.0138 320/500 [==================>...........] - ETA: 1:32 - loss: 0.1147 - regression_loss: 0.1009 - classification_loss: 0.0138 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1146 - regression_loss: 0.1008 - classification_loss: 0.0138 322/500 [==================>...........] - ETA: 1:31 - loss: 0.1145 - regression_loss: 0.1007 - classification_loss: 0.0138 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1143 - regression_loss: 0.1005 - classification_loss: 0.0138 324/500 [==================>...........] 
- ETA: 1:30 - loss: 0.1141 - regression_loss: 0.1003 - classification_loss: 0.0138 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1139 - regression_loss: 0.1001 - classification_loss: 0.0137 326/500 [==================>...........] - ETA: 1:29 - loss: 0.1136 - regression_loss: 0.0999 - classification_loss: 0.0137 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1135 - regression_loss: 0.0998 - classification_loss: 0.0137 328/500 [==================>...........] - ETA: 1:28 - loss: 0.1134 - regression_loss: 0.0997 - classification_loss: 0.0137 329/500 [==================>...........] - ETA: 1:27 - loss: 0.1132 - regression_loss: 0.0996 - classification_loss: 0.0137 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1131 - regression_loss: 0.0995 - classification_loss: 0.0137 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1132 - regression_loss: 0.0995 - classification_loss: 0.0136 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1130 - regression_loss: 0.0994 - classification_loss: 0.0136 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1131 - regression_loss: 0.0995 - classification_loss: 0.0136 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1130 - regression_loss: 0.0994 - classification_loss: 0.0136 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1129 - regression_loss: 0.0993 - classification_loss: 0.0136 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1126 - regression_loss: 0.0991 - classification_loss: 0.0136 337/500 [===================>..........] - ETA: 1:23 - loss: 0.1125 - regression_loss: 0.0990 - classification_loss: 0.0135 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1124 - regression_loss: 0.0988 - classification_loss: 0.0135 339/500 [===================>..........] - ETA: 1:22 - loss: 0.1123 - regression_loss: 0.0988 - classification_loss: 0.0135 340/500 [===================>..........] 
- ETA: 1:21 - loss: 0.1124 - regression_loss: 0.0989 - classification_loss: 0.0135 341/500 [===================>..........] - ETA: 1:21 - loss: 0.1122 - regression_loss: 0.0987 - classification_loss: 0.0135 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1120 - regression_loss: 0.0985 - classification_loss: 0.0135 343/500 [===================>..........] - ETA: 1:20 - loss: 0.1120 - regression_loss: 0.0985 - classification_loss: 0.0135 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1121 - regression_loss: 0.0986 - classification_loss: 0.0135 345/500 [===================>..........] - ETA: 1:19 - loss: 0.1124 - regression_loss: 0.0989 - classification_loss: 0.0135 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1126 - regression_loss: 0.0991 - classification_loss: 0.0135 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1127 - regression_loss: 0.0991 - classification_loss: 0.0135 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1126 - regression_loss: 0.0991 - classification_loss: 0.0135 349/500 [===================>..........] - ETA: 1:17 - loss: 0.1124 - regression_loss: 0.0989 - classification_loss: 0.0135 350/500 [====================>.........] - ETA: 1:16 - loss: 0.1126 - regression_loss: 0.0991 - classification_loss: 0.0135 351/500 [====================>.........] - ETA: 1:16 - loss: 0.1126 - regression_loss: 0.0991 - classification_loss: 0.0135 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1124 - regression_loss: 0.0989 - classification_loss: 0.0135 353/500 [====================>.........] - ETA: 1:15 - loss: 0.1123 - regression_loss: 0.0988 - classification_loss: 0.0135 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1120 - regression_loss: 0.0986 - classification_loss: 0.0134 355/500 [====================>.........] - ETA: 1:14 - loss: 0.1120 - regression_loss: 0.0985 - classification_loss: 0.0134 356/500 [====================>.........] 
- ETA: 1:13 - loss: 0.1125 - regression_loss: 0.0990 - classification_loss: 0.0135 357/500 [====================>.........] - ETA: 1:13 - loss: 0.1123 - regression_loss: 0.0989 - classification_loss: 0.0134 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1121 - regression_loss: 0.0987 - classification_loss: 0.0134 359/500 [====================>.........] - ETA: 1:12 - loss: 0.1120 - regression_loss: 0.0986 - classification_loss: 0.0134 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1118 - regression_loss: 0.0984 - classification_loss: 0.0134 361/500 [====================>.........] - ETA: 1:11 - loss: 0.1119 - regression_loss: 0.0986 - classification_loss: 0.0134 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1119 - regression_loss: 0.0985 - classification_loss: 0.0134 363/500 [====================>.........] - ETA: 1:10 - loss: 0.1118 - regression_loss: 0.0984 - classification_loss: 0.0134 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1115 - regression_loss: 0.0982 - classification_loss: 0.0133 365/500 [====================>.........] - ETA: 1:09 - loss: 0.1115 - regression_loss: 0.0982 - classification_loss: 0.0133 366/500 [====================>.........] - ETA: 1:08 - loss: 0.1114 - regression_loss: 0.0981 - classification_loss: 0.0133 367/500 [=====================>........] - ETA: 1:08 - loss: 0.1113 - regression_loss: 0.0981 - classification_loss: 0.0133 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1112 - regression_loss: 0.0979 - classification_loss: 0.0133 369/500 [=====================>........] - ETA: 1:07 - loss: 0.1111 - regression_loss: 0.0978 - classification_loss: 0.0132 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1109 - regression_loss: 0.0977 - classification_loss: 0.0132 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1107 - regression_loss: 0.0975 - classification_loss: 0.0132 372/500 [=====================>........] 
- ETA: 1:05 - loss: 0.1105 - regression_loss: 0.0974 - classification_loss: 0.0132 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1106 - regression_loss: 0.0974 - classification_loss: 0.0132 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1107 - regression_loss: 0.0975 - classification_loss: 0.0132 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1106 - regression_loss: 0.0974 - classification_loss: 0.0132 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1104 - regression_loss: 0.0972 - classification_loss: 0.0132 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1102 - regression_loss: 0.0971 - classification_loss: 0.0131 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1101 - regression_loss: 0.0969 - classification_loss: 0.0131 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1099 - regression_loss: 0.0968 - classification_loss: 0.0131 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1101 - regression_loss: 0.0969 - classification_loss: 0.0131 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1100 - regression_loss: 0.0969 - classification_loss: 0.0131 382/500 [=====================>........] - ETA: 1:00 - loss: 0.1099 - regression_loss: 0.0968 - classification_loss: 0.0131 383/500 [=====================>........] - ETA: 59s - loss: 0.1098 - regression_loss: 0.0967 - classification_loss: 0.0131  384/500 [======================>.......] - ETA: 59s - loss: 0.1096 - regression_loss: 0.0965 - classification_loss: 0.0131 385/500 [======================>.......] - ETA: 58s - loss: 0.1097 - regression_loss: 0.0966 - classification_loss: 0.0131 386/500 [======================>.......] - ETA: 58s - loss: 0.1096 - regression_loss: 0.0965 - classification_loss: 0.0131 387/500 [======================>.......] - ETA: 57s - loss: 0.1097 - regression_loss: 0.0966 - classification_loss: 0.0130 388/500 [======================>.......] 
- ETA: 57s - loss: 0.1100 - regression_loss: 0.0969 - classification_loss: 0.0131 389/500 [======================>.......] - ETA: 56s - loss: 0.1099 - regression_loss: 0.0967 - classification_loss: 0.0132 390/500 [======================>.......] - ETA: 56s - loss: 0.1098 - regression_loss: 0.0967 - classification_loss: 0.0131 391/500 [======================>.......] - ETA: 55s - loss: 0.1098 - regression_loss: 0.0967 - classification_loss: 0.0131 392/500 [======================>.......] - ETA: 55s - loss: 0.1098 - regression_loss: 0.0967 - classification_loss: 0.0131 393/500 [======================>.......] - ETA: 54s - loss: 0.1096 - regression_loss: 0.0965 - classification_loss: 0.0131 394/500 [======================>.......] - ETA: 54s - loss: 0.1095 - regression_loss: 0.0964 - classification_loss: 0.0131 395/500 [======================>.......] - ETA: 53s - loss: 0.1094 - regression_loss: 0.0964 - classification_loss: 0.0131 396/500 [======================>.......] - ETA: 53s - loss: 0.1094 - regression_loss: 0.0964 - classification_loss: 0.0131 397/500 [======================>.......] - ETA: 52s - loss: 0.1093 - regression_loss: 0.0962 - classification_loss: 0.0131 398/500 [======================>.......] - ETA: 52s - loss: 0.1092 - regression_loss: 0.0961 - classification_loss: 0.0131 399/500 [======================>.......] - ETA: 51s - loss: 0.1090 - regression_loss: 0.0960 - classification_loss: 0.0130 400/500 [=======================>......] - ETA: 51s - loss: 0.1088 - regression_loss: 0.0958 - classification_loss: 0.0130 401/500 [=======================>......] - ETA: 50s - loss: 0.1108 - regression_loss: 0.0972 - classification_loss: 0.0136 402/500 [=======================>......] - ETA: 50s - loss: 0.1117 - regression_loss: 0.0981 - classification_loss: 0.0137 403/500 [=======================>......] - ETA: 49s - loss: 0.1116 - regression_loss: 0.0979 - classification_loss: 0.0137 404/500 [=======================>......] 
- ETA: 49s - loss: 0.1116 - regression_loss: 0.0979 - classification_loss: 0.0137 405/500 [=======================>......] - ETA: 48s - loss: 0.1114 - regression_loss: 0.0978 - classification_loss: 0.0136 406/500 [=======================>......] - ETA: 48s - loss: 0.1113 - regression_loss: 0.0977 - classification_loss: 0.0136 407/500 [=======================>......] - ETA: 47s - loss: 0.1111 - regression_loss: 0.0976 - classification_loss: 0.0136 408/500 [=======================>......] - ETA: 47s - loss: 0.1117 - regression_loss: 0.0981 - classification_loss: 0.0136 409/500 [=======================>......] - ETA: 46s - loss: 0.1115 - regression_loss: 0.0980 - classification_loss: 0.0136 410/500 [=======================>......] - ETA: 46s - loss: 0.1114 - regression_loss: 0.0978 - classification_loss: 0.0136 411/500 [=======================>......] - ETA: 45s - loss: 0.1114 - regression_loss: 0.0978 - classification_loss: 0.0135 412/500 [=======================>......] - ETA: 45s - loss: 0.1112 - regression_loss: 0.0977 - classification_loss: 0.0135 413/500 [=======================>......] - ETA: 44s - loss: 0.1113 - regression_loss: 0.0978 - classification_loss: 0.0135 414/500 [=======================>......] - ETA: 44s - loss: 0.1112 - regression_loss: 0.0977 - classification_loss: 0.0135 415/500 [=======================>......] - ETA: 43s - loss: 0.1110 - regression_loss: 0.0975 - classification_loss: 0.0135 416/500 [=======================>......] - ETA: 42s - loss: 0.1124 - regression_loss: 0.0988 - classification_loss: 0.0136 417/500 [========================>.....] - ETA: 42s - loss: 0.1122 - regression_loss: 0.0986 - classification_loss: 0.0136 418/500 [========================>.....] - ETA: 41s - loss: 0.1120 - regression_loss: 0.0984 - classification_loss: 0.0136 419/500 [========================>.....] - ETA: 41s - loss: 0.1121 - regression_loss: 0.0985 - classification_loss: 0.0136 420/500 [========================>.....] 
[per-batch progress updates for batches 421-499 of epoch 12 omitted; running loss held near 0.111]
500/500 [==============================] - 256s 512ms/step - loss: 0.1106 - regression_loss: 0.0972 - classification_loss: 0.0134
326 instances of class plum with average precision: 0.8427
mAP: 0.8427
Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5
Epoch 13/150
[per-batch progress updates for batches 1-254 of epoch 13 omitted; after an early spike the running loss settled near 0.103 by batch 254/500]
- ETA: 2:05 - loss: 0.1025 - regression_loss: 0.0903 - classification_loss: 0.0121 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1041 - regression_loss: 0.0919 - classification_loss: 0.0122 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1038 - regression_loss: 0.0916 - classification_loss: 0.0122 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1039 - regression_loss: 0.0917 - classification_loss: 0.0122 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1037 - regression_loss: 0.0915 - classification_loss: 0.0122 259/500 [==============>...............] - ETA: 2:03 - loss: 0.1035 - regression_loss: 0.0913 - classification_loss: 0.0122 260/500 [==============>...............] - ETA: 2:02 - loss: 0.1033 - regression_loss: 0.0912 - classification_loss: 0.0121 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1030 - regression_loss: 0.0909 - classification_loss: 0.0121 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1028 - regression_loss: 0.0907 - classification_loss: 0.0121 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1027 - regression_loss: 0.0906 - classification_loss: 0.0121 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1024 - regression_loss: 0.0903 - classification_loss: 0.0121 265/500 [==============>...............] - ETA: 2:00 - loss: 0.1022 - regression_loss: 0.0901 - classification_loss: 0.0120 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1019 - regression_loss: 0.0899 - classification_loss: 0.0120 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1017 - regression_loss: 0.0897 - classification_loss: 0.0120 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1015 - regression_loss: 0.0895 - classification_loss: 0.0120 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1013 - regression_loss: 0.0894 - classification_loss: 0.0120 270/500 [===============>..............] 
- ETA: 1:57 - loss: 0.1011 - regression_loss: 0.0891 - classification_loss: 0.0120 271/500 [===============>..............] - ETA: 1:57 - loss: 0.1010 - regression_loss: 0.0890 - classification_loss: 0.0119 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1008 - regression_loss: 0.0889 - classification_loss: 0.0119 273/500 [===============>..............] - ETA: 1:56 - loss: 0.1007 - regression_loss: 0.0888 - classification_loss: 0.0119 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1010 - regression_loss: 0.0891 - classification_loss: 0.0119 275/500 [===============>..............] - ETA: 1:55 - loss: 0.1008 - regression_loss: 0.0889 - classification_loss: 0.0119 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1007 - regression_loss: 0.0888 - classification_loss: 0.0119 277/500 [===============>..............] - ETA: 1:54 - loss: 0.1007 - regression_loss: 0.0888 - classification_loss: 0.0119 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1006 - regression_loss: 0.0887 - classification_loss: 0.0119 279/500 [===============>..............] - ETA: 1:53 - loss: 0.1008 - regression_loss: 0.0889 - classification_loss: 0.0119 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1008 - regression_loss: 0.0889 - classification_loss: 0.0119 281/500 [===============>..............] - ETA: 1:52 - loss: 0.1011 - regression_loss: 0.0892 - classification_loss: 0.0119 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1010 - regression_loss: 0.0891 - classification_loss: 0.0119 283/500 [===============>..............] - ETA: 1:51 - loss: 0.1017 - regression_loss: 0.0898 - classification_loss: 0.0119 284/500 [================>.............] - ETA: 1:50 - loss: 0.1017 - regression_loss: 0.0898 - classification_loss: 0.0119 285/500 [================>.............] - ETA: 1:50 - loss: 0.1016 - regression_loss: 0.0897 - classification_loss: 0.0119 286/500 [================>.............] 
- ETA: 1:49 - loss: 0.1017 - regression_loss: 0.0898 - classification_loss: 0.0119 287/500 [================>.............] - ETA: 1:48 - loss: 0.1019 - regression_loss: 0.0900 - classification_loss: 0.0119 288/500 [================>.............] - ETA: 1:48 - loss: 0.1017 - regression_loss: 0.0898 - classification_loss: 0.0119 289/500 [================>.............] - ETA: 1:47 - loss: 0.1017 - regression_loss: 0.0898 - classification_loss: 0.0119 290/500 [================>.............] - ETA: 1:47 - loss: 0.1019 - regression_loss: 0.0900 - classification_loss: 0.0119 291/500 [================>.............] - ETA: 1:46 - loss: 0.1018 - regression_loss: 0.0899 - classification_loss: 0.0119 292/500 [================>.............] - ETA: 1:46 - loss: 0.1015 - regression_loss: 0.0896 - classification_loss: 0.0119 293/500 [================>.............] - ETA: 1:45 - loss: 0.1015 - regression_loss: 0.0896 - classification_loss: 0.0119 294/500 [================>.............] - ETA: 1:45 - loss: 0.1016 - regression_loss: 0.0897 - classification_loss: 0.0119 295/500 [================>.............] - ETA: 1:44 - loss: 0.1043 - regression_loss: 0.0915 - classification_loss: 0.0127 296/500 [================>.............] - ETA: 1:44 - loss: 0.1042 - regression_loss: 0.0915 - classification_loss: 0.0127 297/500 [================>.............] - ETA: 1:43 - loss: 0.1040 - regression_loss: 0.0913 - classification_loss: 0.0127 298/500 [================>.............] - ETA: 1:43 - loss: 0.1039 - regression_loss: 0.0912 - classification_loss: 0.0127 299/500 [================>.............] - ETA: 1:42 - loss: 0.1039 - regression_loss: 0.0912 - classification_loss: 0.0127 300/500 [=================>............] - ETA: 1:42 - loss: 0.1037 - regression_loss: 0.0910 - classification_loss: 0.0127 301/500 [=================>............] - ETA: 1:41 - loss: 0.1035 - regression_loss: 0.0909 - classification_loss: 0.0126 302/500 [=================>............] 
- ETA: 1:41 - loss: 0.1033 - regression_loss: 0.0906 - classification_loss: 0.0126 303/500 [=================>............] - ETA: 1:40 - loss: 0.1030 - regression_loss: 0.0904 - classification_loss: 0.0126 304/500 [=================>............] - ETA: 1:40 - loss: 0.1029 - regression_loss: 0.0903 - classification_loss: 0.0126 305/500 [=================>............] - ETA: 1:39 - loss: 0.1027 - regression_loss: 0.0901 - classification_loss: 0.0126 306/500 [=================>............] - ETA: 1:39 - loss: 0.1030 - regression_loss: 0.0905 - classification_loss: 0.0126 307/500 [=================>............] - ETA: 1:38 - loss: 0.1030 - regression_loss: 0.0904 - classification_loss: 0.0126 308/500 [=================>............] - ETA: 1:38 - loss: 0.1028 - regression_loss: 0.0903 - classification_loss: 0.0125 309/500 [=================>............] - ETA: 1:37 - loss: 0.1028 - regression_loss: 0.0903 - classification_loss: 0.0125 310/500 [=================>............] - ETA: 1:37 - loss: 0.1026 - regression_loss: 0.0901 - classification_loss: 0.0125 311/500 [=================>............] - ETA: 1:36 - loss: 0.1025 - regression_loss: 0.0900 - classification_loss: 0.0125 312/500 [=================>............] - ETA: 1:36 - loss: 0.1027 - regression_loss: 0.0902 - classification_loss: 0.0125 313/500 [=================>............] - ETA: 1:35 - loss: 0.1025 - regression_loss: 0.0900 - classification_loss: 0.0125 314/500 [=================>............] - ETA: 1:35 - loss: 0.1025 - regression_loss: 0.0900 - classification_loss: 0.0125 315/500 [=================>............] - ETA: 1:34 - loss: 0.1023 - regression_loss: 0.0898 - classification_loss: 0.0125 316/500 [=================>............] - ETA: 1:34 - loss: 0.1023 - regression_loss: 0.0898 - classification_loss: 0.0125 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1022 - regression_loss: 0.0897 - classification_loss: 0.0125 318/500 [==================>...........] 
- ETA: 1:33 - loss: 0.1020 - regression_loss: 0.0896 - classification_loss: 0.0124 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1021 - regression_loss: 0.0896 - classification_loss: 0.0125 320/500 [==================>...........] - ETA: 1:32 - loss: 0.1019 - regression_loss: 0.0895 - classification_loss: 0.0125 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1018 - regression_loss: 0.0894 - classification_loss: 0.0124 322/500 [==================>...........] - ETA: 1:31 - loss: 0.1036 - regression_loss: 0.0910 - classification_loss: 0.0126 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1036 - regression_loss: 0.0910 - classification_loss: 0.0126 324/500 [==================>...........] - ETA: 1:30 - loss: 0.1037 - regression_loss: 0.0911 - classification_loss: 0.0126 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1036 - regression_loss: 0.0910 - classification_loss: 0.0126 326/500 [==================>...........] - ETA: 1:29 - loss: 0.1042 - regression_loss: 0.0916 - classification_loss: 0.0126 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1041 - regression_loss: 0.0915 - classification_loss: 0.0126 328/500 [==================>...........] - ETA: 1:28 - loss: 0.1049 - regression_loss: 0.0921 - classification_loss: 0.0128 329/500 [==================>...........] - ETA: 1:27 - loss: 0.1047 - regression_loss: 0.0919 - classification_loss: 0.0128 330/500 [==================>...........] - ETA: 1:27 - loss: 0.1050 - regression_loss: 0.0922 - classification_loss: 0.0128 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1048 - regression_loss: 0.0921 - classification_loss: 0.0128 332/500 [==================>...........] - ETA: 1:26 - loss: 0.1047 - regression_loss: 0.0920 - classification_loss: 0.0128 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1045 - regression_loss: 0.0918 - classification_loss: 0.0127 334/500 [===================>..........] 
- ETA: 1:24 - loss: 0.1044 - regression_loss: 0.0916 - classification_loss: 0.0127 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1042 - regression_loss: 0.0915 - classification_loss: 0.0127 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1047 - regression_loss: 0.0920 - classification_loss: 0.0127 337/500 [===================>..........] - ETA: 1:23 - loss: 0.1044 - regression_loss: 0.0917 - classification_loss: 0.0127 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1043 - regression_loss: 0.0917 - classification_loss: 0.0127 339/500 [===================>..........] - ETA: 1:22 - loss: 0.1041 - regression_loss: 0.0915 - classification_loss: 0.0126 340/500 [===================>..........] - ETA: 1:21 - loss: 0.1043 - regression_loss: 0.0917 - classification_loss: 0.0126 341/500 [===================>..........] - ETA: 1:21 - loss: 0.1042 - regression_loss: 0.0916 - classification_loss: 0.0126 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1040 - regression_loss: 0.0914 - classification_loss: 0.0126 343/500 [===================>..........] - ETA: 1:20 - loss: 0.1041 - regression_loss: 0.0915 - classification_loss: 0.0126 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1040 - regression_loss: 0.0914 - classification_loss: 0.0126 345/500 [===================>..........] - ETA: 1:19 - loss: 0.1039 - regression_loss: 0.0913 - classification_loss: 0.0126 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1037 - regression_loss: 0.0911 - classification_loss: 0.0126 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1038 - regression_loss: 0.0912 - classification_loss: 0.0126 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1037 - regression_loss: 0.0911 - classification_loss: 0.0125 349/500 [===================>..........] - ETA: 1:17 - loss: 0.1036 - regression_loss: 0.0911 - classification_loss: 0.0125 350/500 [====================>.........] 
- ETA: 1:16 - loss: 0.1034 - regression_loss: 0.0909 - classification_loss: 0.0125 351/500 [====================>.........] - ETA: 1:16 - loss: 0.1033 - regression_loss: 0.0908 - classification_loss: 0.0125 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1032 - regression_loss: 0.0908 - classification_loss: 0.0125 353/500 [====================>.........] - ETA: 1:15 - loss: 0.1031 - regression_loss: 0.0907 - classification_loss: 0.0125 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1034 - regression_loss: 0.0909 - classification_loss: 0.0125 355/500 [====================>.........] - ETA: 1:14 - loss: 0.1046 - regression_loss: 0.0920 - classification_loss: 0.0126 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1045 - regression_loss: 0.0919 - classification_loss: 0.0126 357/500 [====================>.........] - ETA: 1:13 - loss: 0.1043 - regression_loss: 0.0918 - classification_loss: 0.0126 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1043 - regression_loss: 0.0917 - classification_loss: 0.0126 359/500 [====================>.........] - ETA: 1:12 - loss: 0.1046 - regression_loss: 0.0920 - classification_loss: 0.0126 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1047 - regression_loss: 0.0921 - classification_loss: 0.0126 361/500 [====================>.........] - ETA: 1:11 - loss: 0.1045 - regression_loss: 0.0919 - classification_loss: 0.0126 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1043 - regression_loss: 0.0917 - classification_loss: 0.0125 363/500 [====================>.........] - ETA: 1:10 - loss: 0.1043 - regression_loss: 0.0917 - classification_loss: 0.0125 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1041 - regression_loss: 0.0916 - classification_loss: 0.0125 365/500 [====================>.........] - ETA: 1:09 - loss: 0.1040 - regression_loss: 0.0915 - classification_loss: 0.0125 366/500 [====================>.........] 
- ETA: 1:08 - loss: 0.1038 - regression_loss: 0.0913 - classification_loss: 0.0125 367/500 [=====================>........] - ETA: 1:08 - loss: 0.1037 - regression_loss: 0.0913 - classification_loss: 0.0125 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1036 - regression_loss: 0.0911 - classification_loss: 0.0124 369/500 [=====================>........] - ETA: 1:07 - loss: 0.1035 - regression_loss: 0.0910 - classification_loss: 0.0124 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1033 - regression_loss: 0.0909 - classification_loss: 0.0124 371/500 [=====================>........] - ETA: 1:06 - loss: 0.1032 - regression_loss: 0.0908 - classification_loss: 0.0124 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1031 - regression_loss: 0.0906 - classification_loss: 0.0124 373/500 [=====================>........] - ETA: 1:05 - loss: 0.1030 - regression_loss: 0.0906 - classification_loss: 0.0124 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1033 - regression_loss: 0.0909 - classification_loss: 0.0124 375/500 [=====================>........] - ETA: 1:04 - loss: 0.1033 - regression_loss: 0.0908 - classification_loss: 0.0124 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1034 - regression_loss: 0.0909 - classification_loss: 0.0124 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1034 - regression_loss: 0.0910 - classification_loss: 0.0124 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1039 - regression_loss: 0.0915 - classification_loss: 0.0124 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1041 - regression_loss: 0.0916 - classification_loss: 0.0124 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1042 - regression_loss: 0.0917 - classification_loss: 0.0124 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1040 - regression_loss: 0.0916 - classification_loss: 0.0124 382/500 [=====================>........] 
- ETA: 1:00 - loss: 0.1039 - regression_loss: 0.0915 - classification_loss: 0.0124 383/500 [=====================>........] - ETA: 59s - loss: 0.1038 - regression_loss: 0.0914 - classification_loss: 0.0124  384/500 [======================>.......] - ETA: 59s - loss: 0.1037 - regression_loss: 0.0913 - classification_loss: 0.0124 385/500 [======================>.......] - ETA: 58s - loss: 0.1037 - regression_loss: 0.0913 - classification_loss: 0.0124 386/500 [======================>.......] - ETA: 58s - loss: 0.1037 - regression_loss: 0.0913 - classification_loss: 0.0124 387/500 [======================>.......] - ETA: 57s - loss: 0.1036 - regression_loss: 0.0912 - classification_loss: 0.0124 388/500 [======================>.......] - ETA: 57s - loss: 0.1035 - regression_loss: 0.0912 - classification_loss: 0.0123 389/500 [======================>.......] - ETA: 56s - loss: 0.1034 - regression_loss: 0.0911 - classification_loss: 0.0123 390/500 [======================>.......] - ETA: 56s - loss: 0.1034 - regression_loss: 0.0911 - classification_loss: 0.0123 391/500 [======================>.......] - ETA: 55s - loss: 0.1034 - regression_loss: 0.0910 - classification_loss: 0.0123 392/500 [======================>.......] - ETA: 55s - loss: 0.1033 - regression_loss: 0.0910 - classification_loss: 0.0123 393/500 [======================>.......] - ETA: 54s - loss: 0.1032 - regression_loss: 0.0909 - classification_loss: 0.0123 394/500 [======================>.......] - ETA: 54s - loss: 0.1031 - regression_loss: 0.0908 - classification_loss: 0.0123 395/500 [======================>.......] - ETA: 53s - loss: 0.1030 - regression_loss: 0.0907 - classification_loss: 0.0123 396/500 [======================>.......] - ETA: 53s - loss: 0.1031 - regression_loss: 0.0908 - classification_loss: 0.0123 397/500 [======================>.......] - ETA: 52s - loss: 0.1030 - regression_loss: 0.0907 - classification_loss: 0.0123 398/500 [======================>.......] 
- ETA: 52s - loss: 0.1029 - regression_loss: 0.0906 - classification_loss: 0.0123 399/500 [======================>.......] - ETA: 51s - loss: 0.1028 - regression_loss: 0.0906 - classification_loss: 0.0122 400/500 [=======================>......] - ETA: 51s - loss: 0.1039 - regression_loss: 0.0914 - classification_loss: 0.0125 401/500 [=======================>......] - ETA: 50s - loss: 0.1038 - regression_loss: 0.0913 - classification_loss: 0.0125 402/500 [=======================>......] - ETA: 50s - loss: 0.1037 - regression_loss: 0.0912 - classification_loss: 0.0125 403/500 [=======================>......] - ETA: 49s - loss: 0.1035 - regression_loss: 0.0910 - classification_loss: 0.0125 404/500 [=======================>......] - ETA: 49s - loss: 0.1033 - regression_loss: 0.0909 - classification_loss: 0.0124 405/500 [=======================>......] - ETA: 48s - loss: 0.1035 - regression_loss: 0.0910 - classification_loss: 0.0125 406/500 [=======================>......] - ETA: 48s - loss: 0.1034 - regression_loss: 0.0910 - classification_loss: 0.0124 407/500 [=======================>......] - ETA: 47s - loss: 0.1032 - regression_loss: 0.0908 - classification_loss: 0.0124 408/500 [=======================>......] - ETA: 47s - loss: 0.1033 - regression_loss: 0.0909 - classification_loss: 0.0124 409/500 [=======================>......] - ETA: 46s - loss: 0.1032 - regression_loss: 0.0908 - classification_loss: 0.0124 410/500 [=======================>......] - ETA: 46s - loss: 0.1031 - regression_loss: 0.0907 - classification_loss: 0.0124 411/500 [=======================>......] - ETA: 45s - loss: 0.1030 - regression_loss: 0.0907 - classification_loss: 0.0124 412/500 [=======================>......] - ETA: 45s - loss: 0.1031 - regression_loss: 0.0907 - classification_loss: 0.0124 413/500 [=======================>......] - ETA: 44s - loss: 0.1029 - regression_loss: 0.0905 - classification_loss: 0.0124 414/500 [=======================>......] 
- ETA: 44s - loss: 0.1027 - regression_loss: 0.0904 - classification_loss: 0.0123 415/500 [=======================>......] - ETA: 43s - loss: 0.1031 - regression_loss: 0.0907 - classification_loss: 0.0124 416/500 [=======================>......] - ETA: 43s - loss: 0.1030 - regression_loss: 0.0906 - classification_loss: 0.0124 417/500 [========================>.....] - ETA: 42s - loss: 0.1028 - regression_loss: 0.0905 - classification_loss: 0.0124 418/500 [========================>.....] - ETA: 41s - loss: 0.1043 - regression_loss: 0.0918 - classification_loss: 0.0125 419/500 [========================>.....] - ETA: 41s - loss: 0.1042 - regression_loss: 0.0917 - classification_loss: 0.0125 420/500 [========================>.....] - ETA: 40s - loss: 0.1040 - regression_loss: 0.0916 - classification_loss: 0.0125 421/500 [========================>.....] - ETA: 40s - loss: 0.1039 - regression_loss: 0.0914 - classification_loss: 0.0125 422/500 [========================>.....] - ETA: 39s - loss: 0.1037 - regression_loss: 0.0913 - classification_loss: 0.0124 423/500 [========================>.....] - ETA: 39s - loss: 0.1037 - regression_loss: 0.0912 - classification_loss: 0.0124 424/500 [========================>.....] - ETA: 38s - loss: 0.1036 - regression_loss: 0.0912 - classification_loss: 0.0124 425/500 [========================>.....] - ETA: 38s - loss: 0.1043 - regression_loss: 0.0919 - classification_loss: 0.0124 426/500 [========================>.....] - ETA: 37s - loss: 0.1042 - regression_loss: 0.0918 - classification_loss: 0.0124 427/500 [========================>.....] - ETA: 37s - loss: 0.1045 - regression_loss: 0.0921 - classification_loss: 0.0124 428/500 [========================>.....] - ETA: 36s - loss: 0.1044 - regression_loss: 0.0920 - classification_loss: 0.0124 429/500 [========================>.....] - ETA: 36s - loss: 0.1043 - regression_loss: 0.0919 - classification_loss: 0.0124 430/500 [========================>.....] 
- ETA: 35s - loss: 0.1042 - regression_loss: 0.0918 - classification_loss: 0.0124 431/500 [========================>.....] - ETA: 35s - loss: 0.1047 - regression_loss: 0.0922 - classification_loss: 0.0124 432/500 [========================>.....] - ETA: 34s - loss: 0.1045 - regression_loss: 0.0921 - classification_loss: 0.0124 433/500 [========================>.....] - ETA: 34s - loss: 0.1045 - regression_loss: 0.0920 - classification_loss: 0.0124 434/500 [=========================>....] - ETA: 33s - loss: 0.1046 - regression_loss: 0.0921 - classification_loss: 0.0124 435/500 [=========================>....] - ETA: 33s - loss: 0.1044 - regression_loss: 0.0920 - classification_loss: 0.0124 436/500 [=========================>....] - ETA: 32s - loss: 0.1047 - regression_loss: 0.0923 - classification_loss: 0.0124 437/500 [=========================>....] - ETA: 32s - loss: 0.1046 - regression_loss: 0.0922 - classification_loss: 0.0124 438/500 [=========================>....] - ETA: 31s - loss: 0.1045 - regression_loss: 0.0921 - classification_loss: 0.0124 439/500 [=========================>....] - ETA: 31s - loss: 0.1046 - regression_loss: 0.0922 - classification_loss: 0.0124 440/500 [=========================>....] - ETA: 30s - loss: 0.1045 - regression_loss: 0.0920 - classification_loss: 0.0124 441/500 [=========================>....] - ETA: 30s - loss: 0.1044 - regression_loss: 0.0920 - classification_loss: 0.0124 442/500 [=========================>....] - ETA: 29s - loss: 0.1043 - regression_loss: 0.0919 - classification_loss: 0.0124 443/500 [=========================>....] - ETA: 29s - loss: 0.1043 - regression_loss: 0.0919 - classification_loss: 0.0124 444/500 [=========================>....] - ETA: 28s - loss: 0.1044 - regression_loss: 0.0920 - classification_loss: 0.0124 445/500 [=========================>....] - ETA: 28s - loss: 0.1043 - regression_loss: 0.0920 - classification_loss: 0.0124 446/500 [=========================>....] 
- ETA: 27s - loss: 0.1042 - regression_loss: 0.0919 - classification_loss: 0.0124 447/500 [=========================>....] - ETA: 27s - loss: 0.1043 - regression_loss: 0.0919 - classification_loss: 0.0124 448/500 [=========================>....] - ETA: 26s - loss: 0.1043 - regression_loss: 0.0919 - classification_loss: 0.0124 449/500 [=========================>....] - ETA: 26s - loss: 0.1043 - regression_loss: 0.0919 - classification_loss: 0.0124 450/500 [==========================>...] - ETA: 25s - loss: 0.1042 - regression_loss: 0.0919 - classification_loss: 0.0124 451/500 [==========================>...] - ETA: 25s - loss: 0.1046 - regression_loss: 0.0922 - classification_loss: 0.0124 452/500 [==========================>...] - ETA: 24s - loss: 0.1054 - regression_loss: 0.0928 - classification_loss: 0.0126 453/500 [==========================>...] - ETA: 24s - loss: 0.1054 - regression_loss: 0.0928 - classification_loss: 0.0126 454/500 [==========================>...] - ETA: 23s - loss: 0.1053 - regression_loss: 0.0927 - classification_loss: 0.0126 455/500 [==========================>...] - ETA: 23s - loss: 0.1052 - regression_loss: 0.0926 - classification_loss: 0.0125 456/500 [==========================>...] - ETA: 22s - loss: 0.1050 - regression_loss: 0.0925 - classification_loss: 0.0125 457/500 [==========================>...] - ETA: 22s - loss: 0.1049 - regression_loss: 0.0924 - classification_loss: 0.0125 458/500 [==========================>...] - ETA: 21s - loss: 0.1051 - regression_loss: 0.0926 - classification_loss: 0.0126 459/500 [==========================>...] - ETA: 20s - loss: 0.1051 - regression_loss: 0.0925 - classification_loss: 0.0126 460/500 [==========================>...] - ETA: 20s - loss: 0.1049 - regression_loss: 0.0924 - classification_loss: 0.0125 461/500 [==========================>...] - ETA: 19s - loss: 0.1048 - regression_loss: 0.0923 - classification_loss: 0.0125 462/500 [==========================>...] 
[... per-batch progress output for epoch 13 condensed: batches 462-499, loss steady near 0.105 ...]
500/500 [==============================] - 256s 512ms/step - loss: 0.1055 - regression_loss: 0.0925 - classification_loss: 0.0130
326 instances of class plum with average precision: 0.8338
mAP: 0.8338
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/150
[... per-batch progress output for epoch 14 condensed: batches 1-297 of 500, loss settling near 0.100 (regression ~0.088, classification ~0.012) ...]
- ETA: 1:43 - loss: 0.0998 - regression_loss: 0.0879 - classification_loss: 0.0119 298/500 [================>.............] - ETA: 1:43 - loss: 0.0997 - regression_loss: 0.0878 - classification_loss: 0.0118 299/500 [================>.............] - ETA: 1:42 - loss: 0.0997 - regression_loss: 0.0879 - classification_loss: 0.0118 300/500 [=================>............] - ETA: 1:42 - loss: 0.0997 - regression_loss: 0.0879 - classification_loss: 0.0118 301/500 [=================>............] - ETA: 1:41 - loss: 0.0998 - regression_loss: 0.0879 - classification_loss: 0.0118 302/500 [=================>............] - ETA: 1:41 - loss: 0.0997 - regression_loss: 0.0879 - classification_loss: 0.0118 303/500 [=================>............] - ETA: 1:40 - loss: 0.0994 - regression_loss: 0.0877 - classification_loss: 0.0118 304/500 [=================>............] - ETA: 1:40 - loss: 0.0993 - regression_loss: 0.0875 - classification_loss: 0.0118 305/500 [=================>............] - ETA: 1:39 - loss: 0.0994 - regression_loss: 0.0876 - classification_loss: 0.0118 306/500 [=================>............] - ETA: 1:39 - loss: 0.0993 - regression_loss: 0.0875 - classification_loss: 0.0118 307/500 [=================>............] - ETA: 1:38 - loss: 0.0994 - regression_loss: 0.0876 - classification_loss: 0.0118 308/500 [=================>............] - ETA: 1:38 - loss: 0.0994 - regression_loss: 0.0876 - classification_loss: 0.0118 309/500 [=================>............] - ETA: 1:37 - loss: 0.0995 - regression_loss: 0.0877 - classification_loss: 0.0118 310/500 [=================>............] - ETA: 1:37 - loss: 0.0994 - regression_loss: 0.0876 - classification_loss: 0.0118 311/500 [=================>............] - ETA: 1:36 - loss: 0.0993 - regression_loss: 0.0875 - classification_loss: 0.0118 312/500 [=================>............] - ETA: 1:36 - loss: 0.0991 - regression_loss: 0.0874 - classification_loss: 0.0118 313/500 [=================>............] 
- ETA: 1:35 - loss: 0.0990 - regression_loss: 0.0872 - classification_loss: 0.0117 314/500 [=================>............] - ETA: 1:35 - loss: 0.0989 - regression_loss: 0.0872 - classification_loss: 0.0117 315/500 [=================>............] - ETA: 1:34 - loss: 0.0990 - regression_loss: 0.0873 - classification_loss: 0.0117 316/500 [=================>............] - ETA: 1:34 - loss: 0.0988 - regression_loss: 0.0871 - classification_loss: 0.0117 317/500 [==================>...........] - ETA: 1:33 - loss: 0.0987 - regression_loss: 0.0870 - classification_loss: 0.0117 318/500 [==================>...........] - ETA: 1:33 - loss: 0.0991 - regression_loss: 0.0874 - classification_loss: 0.0117 319/500 [==================>...........] - ETA: 1:32 - loss: 0.0990 - regression_loss: 0.0873 - classification_loss: 0.0117 320/500 [==================>...........] - ETA: 1:32 - loss: 0.0989 - regression_loss: 0.0872 - classification_loss: 0.0117 321/500 [==================>...........] - ETA: 1:31 - loss: 0.0988 - regression_loss: 0.0871 - classification_loss: 0.0117 322/500 [==================>...........] - ETA: 1:31 - loss: 0.0987 - regression_loss: 0.0870 - classification_loss: 0.0117 323/500 [==================>...........] - ETA: 1:30 - loss: 0.0986 - regression_loss: 0.0869 - classification_loss: 0.0117 324/500 [==================>...........] - ETA: 1:30 - loss: 0.0984 - regression_loss: 0.0867 - classification_loss: 0.0116 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1008 - regression_loss: 0.0884 - classification_loss: 0.0124 326/500 [==================>...........] - ETA: 1:29 - loss: 0.1006 - regression_loss: 0.0883 - classification_loss: 0.0124 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1005 - regression_loss: 0.0881 - classification_loss: 0.0124 328/500 [==================>...........] - ETA: 1:28 - loss: 0.1004 - regression_loss: 0.0880 - classification_loss: 0.0123 329/500 [==================>...........] 
- ETA: 1:27 - loss: 0.1005 - regression_loss: 0.0881 - classification_loss: 0.0124 330/500 [==================>...........] - ETA: 1:27 - loss: 0.1003 - regression_loss: 0.0880 - classification_loss: 0.0123 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1001 - regression_loss: 0.0878 - classification_loss: 0.0123 332/500 [==================>...........] - ETA: 1:26 - loss: 0.1001 - regression_loss: 0.0878 - classification_loss: 0.0123 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1000 - regression_loss: 0.0877 - classification_loss: 0.0123 334/500 [===================>..........] - ETA: 1:24 - loss: 0.0998 - regression_loss: 0.0875 - classification_loss: 0.0123 335/500 [===================>..........] - ETA: 1:24 - loss: 0.0996 - regression_loss: 0.0873 - classification_loss: 0.0123 336/500 [===================>..........] - ETA: 1:23 - loss: 0.0995 - regression_loss: 0.0872 - classification_loss: 0.0122 337/500 [===================>..........] - ETA: 1:23 - loss: 0.0993 - regression_loss: 0.0871 - classification_loss: 0.0122 338/500 [===================>..........] - ETA: 1:22 - loss: 0.0997 - regression_loss: 0.0874 - classification_loss: 0.0122 339/500 [===================>..........] - ETA: 1:22 - loss: 0.0995 - regression_loss: 0.0872 - classification_loss: 0.0122 340/500 [===================>..........] - ETA: 1:21 - loss: 0.0994 - regression_loss: 0.0871 - classification_loss: 0.0122 341/500 [===================>..........] - ETA: 1:21 - loss: 0.0999 - regression_loss: 0.0876 - classification_loss: 0.0123 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1001 - regression_loss: 0.0878 - classification_loss: 0.0123 343/500 [===================>..........] - ETA: 1:20 - loss: 0.1000 - regression_loss: 0.0878 - classification_loss: 0.0123 344/500 [===================>..........] - ETA: 1:19 - loss: 0.0998 - regression_loss: 0.0876 - classification_loss: 0.0123 345/500 [===================>..........] 
- ETA: 1:19 - loss: 0.0998 - regression_loss: 0.0875 - classification_loss: 0.0122 346/500 [===================>..........] - ETA: 1:18 - loss: 0.0997 - regression_loss: 0.0875 - classification_loss: 0.0122 347/500 [===================>..........] - ETA: 1:18 - loss: 0.1000 - regression_loss: 0.0878 - classification_loss: 0.0123 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1000 - regression_loss: 0.0877 - classification_loss: 0.0122 349/500 [===================>..........] - ETA: 1:17 - loss: 0.0999 - regression_loss: 0.0876 - classification_loss: 0.0122 350/500 [====================>.........] - ETA: 1:16 - loss: 0.0997 - regression_loss: 0.0875 - classification_loss: 0.0122 351/500 [====================>.........] - ETA: 1:16 - loss: 0.0996 - regression_loss: 0.0873 - classification_loss: 0.0122 352/500 [====================>.........] - ETA: 1:15 - loss: 0.0995 - regression_loss: 0.0873 - classification_loss: 0.0122 353/500 [====================>.........] - ETA: 1:15 - loss: 0.0993 - regression_loss: 0.0872 - classification_loss: 0.0122 354/500 [====================>.........] - ETA: 1:14 - loss: 0.0994 - regression_loss: 0.0872 - classification_loss: 0.0122 355/500 [====================>.........] - ETA: 1:14 - loss: 0.0996 - regression_loss: 0.0874 - classification_loss: 0.0122 356/500 [====================>.........] - ETA: 1:13 - loss: 0.0995 - regression_loss: 0.0873 - classification_loss: 0.0122 357/500 [====================>.........] - ETA: 1:13 - loss: 0.0993 - regression_loss: 0.0872 - classification_loss: 0.0122 358/500 [====================>.........] - ETA: 1:12 - loss: 0.0993 - regression_loss: 0.0872 - classification_loss: 0.0122 359/500 [====================>.........] - ETA: 1:12 - loss: 0.0992 - regression_loss: 0.0871 - classification_loss: 0.0122 360/500 [====================>.........] - ETA: 1:11 - loss: 0.0991 - regression_loss: 0.0869 - classification_loss: 0.0121 361/500 [====================>.........] 
- ETA: 1:11 - loss: 0.0990 - regression_loss: 0.0869 - classification_loss: 0.0121 362/500 [====================>.........] - ETA: 1:10 - loss: 0.0989 - regression_loss: 0.0868 - classification_loss: 0.0121 363/500 [====================>.........] - ETA: 1:10 - loss: 0.0987 - regression_loss: 0.0866 - classification_loss: 0.0121 364/500 [====================>.........] - ETA: 1:09 - loss: 0.0985 - regression_loss: 0.0865 - classification_loss: 0.0121 365/500 [====================>.........] - ETA: 1:09 - loss: 0.0984 - regression_loss: 0.0863 - classification_loss: 0.0121 366/500 [====================>.........] - ETA: 1:08 - loss: 0.0985 - regression_loss: 0.0864 - classification_loss: 0.0121 367/500 [=====================>........] - ETA: 1:08 - loss: 0.0984 - regression_loss: 0.0863 - classification_loss: 0.0121 368/500 [=====================>........] - ETA: 1:07 - loss: 0.0983 - regression_loss: 0.0862 - classification_loss: 0.0120 369/500 [=====================>........] - ETA: 1:07 - loss: 0.0981 - regression_loss: 0.0861 - classification_loss: 0.0120 370/500 [=====================>........] - ETA: 1:06 - loss: 0.0979 - regression_loss: 0.0859 - classification_loss: 0.0120 371/500 [=====================>........] - ETA: 1:06 - loss: 0.0978 - regression_loss: 0.0858 - classification_loss: 0.0120 372/500 [=====================>........] - ETA: 1:05 - loss: 0.0977 - regression_loss: 0.0857 - classification_loss: 0.0120 373/500 [=====================>........] - ETA: 1:05 - loss: 0.0976 - regression_loss: 0.0856 - classification_loss: 0.0120 374/500 [=====================>........] - ETA: 1:04 - loss: 0.0975 - regression_loss: 0.0855 - classification_loss: 0.0119 375/500 [=====================>........] - ETA: 1:03 - loss: 0.0974 - regression_loss: 0.0855 - classification_loss: 0.0119 376/500 [=====================>........] - ETA: 1:03 - loss: 0.0974 - regression_loss: 0.0854 - classification_loss: 0.0119 377/500 [=====================>........] 
- ETA: 1:02 - loss: 0.0973 - regression_loss: 0.0854 - classification_loss: 0.0119 378/500 [=====================>........] - ETA: 1:02 - loss: 0.0973 - regression_loss: 0.0854 - classification_loss: 0.0119 379/500 [=====================>........] - ETA: 1:01 - loss: 0.0971 - regression_loss: 0.0852 - classification_loss: 0.0119 380/500 [=====================>........] - ETA: 1:01 - loss: 0.0970 - regression_loss: 0.0851 - classification_loss: 0.0119 381/500 [=====================>........] - ETA: 1:00 - loss: 0.0968 - regression_loss: 0.0850 - classification_loss: 0.0119 382/500 [=====================>........] - ETA: 1:00 - loss: 0.0967 - regression_loss: 0.0848 - classification_loss: 0.0118 383/500 [=====================>........] - ETA: 59s - loss: 0.0966 - regression_loss: 0.0848 - classification_loss: 0.0118  384/500 [======================>.......] - ETA: 59s - loss: 0.0966 - regression_loss: 0.0847 - classification_loss: 0.0118 385/500 [======================>.......] - ETA: 58s - loss: 0.0966 - regression_loss: 0.0848 - classification_loss: 0.0118 386/500 [======================>.......] - ETA: 58s - loss: 0.0975 - regression_loss: 0.0856 - classification_loss: 0.0118 387/500 [======================>.......] - ETA: 57s - loss: 0.0973 - regression_loss: 0.0854 - classification_loss: 0.0118 388/500 [======================>.......] - ETA: 57s - loss: 0.0971 - regression_loss: 0.0853 - classification_loss: 0.0118 389/500 [======================>.......] - ETA: 56s - loss: 0.0970 - regression_loss: 0.0852 - classification_loss: 0.0118 390/500 [======================>.......] - ETA: 56s - loss: 0.0969 - regression_loss: 0.0851 - classification_loss: 0.0118 391/500 [======================>.......] - ETA: 55s - loss: 0.0971 - regression_loss: 0.0853 - classification_loss: 0.0118 392/500 [======================>.......] - ETA: 55s - loss: 0.0971 - regression_loss: 0.0853 - classification_loss: 0.0118 393/500 [======================>.......] 
- ETA: 54s - loss: 0.0969 - regression_loss: 0.0852 - classification_loss: 0.0118 394/500 [======================>.......] - ETA: 54s - loss: 0.0969 - regression_loss: 0.0851 - classification_loss: 0.0118 395/500 [======================>.......] - ETA: 53s - loss: 0.0968 - regression_loss: 0.0851 - classification_loss: 0.0117 396/500 [======================>.......] - ETA: 53s - loss: 0.0967 - regression_loss: 0.0850 - classification_loss: 0.0117 397/500 [======================>.......] - ETA: 52s - loss: 0.0968 - regression_loss: 0.0851 - classification_loss: 0.0117 398/500 [======================>.......] - ETA: 52s - loss: 0.0968 - regression_loss: 0.0850 - classification_loss: 0.0117 399/500 [======================>.......] - ETA: 51s - loss: 0.0969 - regression_loss: 0.0852 - classification_loss: 0.0117 400/500 [=======================>......] - ETA: 51s - loss: 0.0968 - regression_loss: 0.0851 - classification_loss: 0.0117 401/500 [=======================>......] - ETA: 50s - loss: 0.0966 - regression_loss: 0.0849 - classification_loss: 0.0117 402/500 [=======================>......] - ETA: 50s - loss: 0.0967 - regression_loss: 0.0850 - classification_loss: 0.0117 403/500 [=======================>......] - ETA: 49s - loss: 0.0967 - regression_loss: 0.0850 - classification_loss: 0.0117 404/500 [=======================>......] - ETA: 49s - loss: 0.0966 - regression_loss: 0.0849 - classification_loss: 0.0117 405/500 [=======================>......] - ETA: 48s - loss: 0.0964 - regression_loss: 0.0847 - classification_loss: 0.0117 406/500 [=======================>......] - ETA: 48s - loss: 0.0964 - regression_loss: 0.0848 - classification_loss: 0.0117 407/500 [=======================>......] - ETA: 47s - loss: 0.0967 - regression_loss: 0.0850 - classification_loss: 0.0117 408/500 [=======================>......] - ETA: 47s - loss: 0.0965 - regression_loss: 0.0849 - classification_loss: 0.0117 409/500 [=======================>......] 
- ETA: 46s - loss: 0.0966 - regression_loss: 0.0850 - classification_loss: 0.0117 410/500 [=======================>......] - ETA: 46s - loss: 0.0965 - regression_loss: 0.0849 - classification_loss: 0.0117 411/500 [=======================>......] - ETA: 45s - loss: 0.0964 - regression_loss: 0.0848 - classification_loss: 0.0116 412/500 [=======================>......] - ETA: 45s - loss: 0.0963 - regression_loss: 0.0846 - classification_loss: 0.0116 413/500 [=======================>......] - ETA: 44s - loss: 0.0961 - regression_loss: 0.0845 - classification_loss: 0.0116 414/500 [=======================>......] - ETA: 44s - loss: 0.0960 - regression_loss: 0.0844 - classification_loss: 0.0116 415/500 [=======================>......] - ETA: 43s - loss: 0.0959 - regression_loss: 0.0843 - classification_loss: 0.0116 416/500 [=======================>......] - ETA: 42s - loss: 0.0960 - regression_loss: 0.0844 - classification_loss: 0.0116 417/500 [========================>.....] - ETA: 42s - loss: 0.0959 - regression_loss: 0.0843 - classification_loss: 0.0116 418/500 [========================>.....] - ETA: 41s - loss: 0.0958 - regression_loss: 0.0843 - classification_loss: 0.0116 419/500 [========================>.....] - ETA: 41s - loss: 0.0960 - regression_loss: 0.0844 - classification_loss: 0.0116 420/500 [========================>.....] - ETA: 40s - loss: 0.0959 - regression_loss: 0.0843 - classification_loss: 0.0116 421/500 [========================>.....] - ETA: 40s - loss: 0.0961 - regression_loss: 0.0845 - classification_loss: 0.0116 422/500 [========================>.....] - ETA: 39s - loss: 0.0961 - regression_loss: 0.0845 - classification_loss: 0.0116 423/500 [========================>.....] - ETA: 39s - loss: 0.0960 - regression_loss: 0.0844 - classification_loss: 0.0116 424/500 [========================>.....] - ETA: 38s - loss: 0.0961 - regression_loss: 0.0845 - classification_loss: 0.0116 425/500 [========================>.....] 
- ETA: 38s - loss: 0.0960 - regression_loss: 0.0845 - classification_loss: 0.0116 426/500 [========================>.....] - ETA: 37s - loss: 0.0959 - regression_loss: 0.0844 - classification_loss: 0.0116 427/500 [========================>.....] - ETA: 37s - loss: 0.0959 - regression_loss: 0.0844 - classification_loss: 0.0115 428/500 [========================>.....] - ETA: 36s - loss: 0.0958 - regression_loss: 0.0843 - classification_loss: 0.0115 429/500 [========================>.....] - ETA: 36s - loss: 0.0957 - regression_loss: 0.0842 - classification_loss: 0.0115 430/500 [========================>.....] - ETA: 35s - loss: 0.0956 - regression_loss: 0.0841 - classification_loss: 0.0115 431/500 [========================>.....] - ETA: 35s - loss: 0.0955 - regression_loss: 0.0841 - classification_loss: 0.0115 432/500 [========================>.....] - ETA: 34s - loss: 0.0955 - regression_loss: 0.0840 - classification_loss: 0.0115 433/500 [========================>.....] - ETA: 34s - loss: 0.0956 - regression_loss: 0.0841 - classification_loss: 0.0115 434/500 [=========================>....] - ETA: 33s - loss: 0.0955 - regression_loss: 0.0840 - classification_loss: 0.0115 435/500 [=========================>....] - ETA: 33s - loss: 0.0956 - regression_loss: 0.0841 - classification_loss: 0.0115 436/500 [=========================>....] - ETA: 32s - loss: 0.0959 - regression_loss: 0.0844 - classification_loss: 0.0115 437/500 [=========================>....] - ETA: 32s - loss: 0.0958 - regression_loss: 0.0843 - classification_loss: 0.0115 438/500 [=========================>....] - ETA: 31s - loss: 0.0962 - regression_loss: 0.0847 - classification_loss: 0.0115 439/500 [=========================>....] - ETA: 31s - loss: 0.0966 - regression_loss: 0.0850 - classification_loss: 0.0116 440/500 [=========================>....] - ETA: 30s - loss: 0.0965 - regression_loss: 0.0849 - classification_loss: 0.0115 441/500 [=========================>....] 
- ETA: 30s - loss: 0.0967 - regression_loss: 0.0851 - classification_loss: 0.0115 442/500 [=========================>....] - ETA: 29s - loss: 0.0967 - regression_loss: 0.0851 - classification_loss: 0.0115 443/500 [=========================>....] - ETA: 29s - loss: 0.0965 - regression_loss: 0.0850 - classification_loss: 0.0115 444/500 [=========================>....] - ETA: 28s - loss: 0.0964 - regression_loss: 0.0849 - classification_loss: 0.0115 445/500 [=========================>....] - ETA: 28s - loss: 0.0963 - regression_loss: 0.0848 - classification_loss: 0.0115 446/500 [=========================>....] - ETA: 27s - loss: 0.0962 - regression_loss: 0.0847 - classification_loss: 0.0115 447/500 [=========================>....] - ETA: 27s - loss: 0.0975 - regression_loss: 0.0859 - classification_loss: 0.0116 448/500 [=========================>....] - ETA: 26s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 449/500 [=========================>....] - ETA: 26s - loss: 0.0973 - regression_loss: 0.0857 - classification_loss: 0.0116 450/500 [==========================>...] - ETA: 25s - loss: 0.0972 - regression_loss: 0.0856 - classification_loss: 0.0116 451/500 [==========================>...] - ETA: 25s - loss: 0.0972 - regression_loss: 0.0856 - classification_loss: 0.0116 452/500 [==========================>...] - ETA: 24s - loss: 0.0971 - regression_loss: 0.0855 - classification_loss: 0.0116 453/500 [==========================>...] - ETA: 24s - loss: 0.0980 - regression_loss: 0.0863 - classification_loss: 0.0116 454/500 [==========================>...] - ETA: 23s - loss: 0.0979 - regression_loss: 0.0862 - classification_loss: 0.0116 455/500 [==========================>...] - ETA: 23s - loss: 0.0978 - regression_loss: 0.0861 - classification_loss: 0.0116 456/500 [==========================>...] - ETA: 22s - loss: 0.0978 - regression_loss: 0.0862 - classification_loss: 0.0116 457/500 [==========================>...] 
- ETA: 22s - loss: 0.0976 - regression_loss: 0.0860 - classification_loss: 0.0116 458/500 [==========================>...] - ETA: 21s - loss: 0.0975 - regression_loss: 0.0859 - classification_loss: 0.0116 459/500 [==========================>...] - ETA: 20s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 460/500 [==========================>...] - ETA: 20s - loss: 0.0972 - regression_loss: 0.0857 - classification_loss: 0.0116 461/500 [==========================>...] - ETA: 19s - loss: 0.0976 - regression_loss: 0.0860 - classification_loss: 0.0116 462/500 [==========================>...] - ETA: 19s - loss: 0.0975 - regression_loss: 0.0859 - classification_loss: 0.0116 463/500 [==========================>...] - ETA: 18s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 464/500 [==========================>...] - ETA: 18s - loss: 0.0975 - regression_loss: 0.0859 - classification_loss: 0.0116 465/500 [==========================>...] - ETA: 17s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 466/500 [==========================>...] - ETA: 17s - loss: 0.0973 - regression_loss: 0.0857 - classification_loss: 0.0116 467/500 [===========================>..] - ETA: 16s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 468/500 [===========================>..] - ETA: 16s - loss: 0.0978 - regression_loss: 0.0862 - classification_loss: 0.0116 469/500 [===========================>..] - ETA: 15s - loss: 0.0977 - regression_loss: 0.0861 - classification_loss: 0.0116 470/500 [===========================>..] - ETA: 15s - loss: 0.0979 - regression_loss: 0.0863 - classification_loss: 0.0116 471/500 [===========================>..] - ETA: 14s - loss: 0.0979 - regression_loss: 0.0862 - classification_loss: 0.0116 472/500 [===========================>..] - ETA: 14s - loss: 0.0977 - regression_loss: 0.0861 - classification_loss: 0.0116 473/500 [===========================>..] 
- ETA: 13s - loss: 0.0977 - regression_loss: 0.0861 - classification_loss: 0.0116 474/500 [===========================>..] - ETA: 13s - loss: 0.0977 - regression_loss: 0.0861 - classification_loss: 0.0116 475/500 [===========================>..] - ETA: 12s - loss: 0.0975 - regression_loss: 0.0859 - classification_loss: 0.0116 476/500 [===========================>..] - ETA: 12s - loss: 0.0978 - regression_loss: 0.0861 - classification_loss: 0.0116 477/500 [===========================>..] - ETA: 11s - loss: 0.0977 - regression_loss: 0.0861 - classification_loss: 0.0116 478/500 [===========================>..] - ETA: 11s - loss: 0.0976 - regression_loss: 0.0859 - classification_loss: 0.0116 479/500 [===========================>..] - ETA: 10s - loss: 0.0976 - regression_loss: 0.0860 - classification_loss: 0.0116 480/500 [===========================>..] - ETA: 10s - loss: 0.0976 - regression_loss: 0.0859 - classification_loss: 0.0116 481/500 [===========================>..] - ETA: 9s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116  482/500 [===========================>..] - ETA: 9s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 483/500 [===========================>..] - ETA: 8s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 484/500 [============================>.] - ETA: 8s - loss: 0.0975 - regression_loss: 0.0859 - classification_loss: 0.0116 485/500 [============================>.] - ETA: 7s - loss: 0.0974 - regression_loss: 0.0859 - classification_loss: 0.0116 486/500 [============================>.] - ETA: 7s - loss: 0.0973 - regression_loss: 0.0858 - classification_loss: 0.0116 487/500 [============================>.] - ETA: 6s - loss: 0.0974 - regression_loss: 0.0858 - classification_loss: 0.0116 488/500 [============================>.] - ETA: 6s - loss: 0.0973 - regression_loss: 0.0858 - classification_loss: 0.0116 489/500 [============================>.] 
[Steps 489-499 of epoch 14 elided.]
500/500 [==============================] - 256s 511ms/step - loss: 0.0966 - regression_loss: 0.0851 - classification_loss: 0.0115
326 instances of class plum with average precision: 0.8413
mAP: 0.8413
Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5
Epoch 15/150
[Steps 1-3 of epoch 15 elided; loss fell from 0.5427 at step 1 to 0.2133 at step 3.]
[Per-step progress for steps 4-67 of epoch 15 elided; loss declined from 0.1688 at step 4 to 0.1299 at step 67, with per-step ETA falling from ~4:07 to ~3:41.]
- ETA: 3:41 - loss: 0.1290 - regression_loss: 0.1080 - classification_loss: 0.0210 69/500 [===>..........................] - ETA: 3:40 - loss: 0.1278 - regression_loss: 0.1069 - classification_loss: 0.0208 70/500 [===>..........................] - ETA: 3:40 - loss: 0.1270 - regression_loss: 0.1063 - classification_loss: 0.0207 71/500 [===>..........................] - ETA: 3:39 - loss: 0.1265 - regression_loss: 0.1060 - classification_loss: 0.0205 72/500 [===>..........................] - ETA: 3:39 - loss: 0.1261 - regression_loss: 0.1057 - classification_loss: 0.0204 73/500 [===>..........................] - ETA: 3:38 - loss: 0.1260 - regression_loss: 0.1058 - classification_loss: 0.0203 74/500 [===>..........................] - ETA: 3:38 - loss: 0.1252 - regression_loss: 0.1051 - classification_loss: 0.0201 75/500 [===>..........................] - ETA: 3:37 - loss: 0.1254 - regression_loss: 0.1054 - classification_loss: 0.0200 76/500 [===>..........................] - ETA: 3:37 - loss: 0.1263 - regression_loss: 0.1062 - classification_loss: 0.0201 77/500 [===>..........................] - ETA: 3:36 - loss: 0.1251 - regression_loss: 0.1051 - classification_loss: 0.0199 78/500 [===>..........................] - ETA: 3:36 - loss: 0.1242 - regression_loss: 0.1044 - classification_loss: 0.0198 79/500 [===>..........................] - ETA: 3:35 - loss: 0.1233 - regression_loss: 0.1037 - classification_loss: 0.0196 80/500 [===>..........................] - ETA: 3:35 - loss: 0.1223 - regression_loss: 0.1029 - classification_loss: 0.0194 81/500 [===>..........................] - ETA: 3:34 - loss: 0.1246 - regression_loss: 0.1050 - classification_loss: 0.0196 82/500 [===>..........................] - ETA: 3:33 - loss: 0.1248 - regression_loss: 0.1053 - classification_loss: 0.0195 83/500 [===>..........................] - ETA: 3:33 - loss: 0.1249 - regression_loss: 0.1053 - classification_loss: 0.0195 84/500 [====>.........................] 
- ETA: 3:32 - loss: 0.1240 - regression_loss: 0.1046 - classification_loss: 0.0194 85/500 [====>.........................] - ETA: 3:32 - loss: 0.1245 - regression_loss: 0.1052 - classification_loss: 0.0193 86/500 [====>.........................] - ETA: 3:31 - loss: 0.1238 - regression_loss: 0.1046 - classification_loss: 0.0192 87/500 [====>.........................] - ETA: 3:31 - loss: 0.1231 - regression_loss: 0.1041 - classification_loss: 0.0190 88/500 [====>.........................] - ETA: 3:30 - loss: 0.1230 - regression_loss: 0.1041 - classification_loss: 0.0189 89/500 [====>.........................] - ETA: 3:30 - loss: 0.1272 - regression_loss: 0.1081 - classification_loss: 0.0191 90/500 [====>.........................] - ETA: 3:29 - loss: 0.1264 - regression_loss: 0.1075 - classification_loss: 0.0189 91/500 [====>.........................] - ETA: 3:29 - loss: 0.1261 - regression_loss: 0.1073 - classification_loss: 0.0188 92/500 [====>.........................] - ETA: 3:28 - loss: 0.1249 - regression_loss: 0.1063 - classification_loss: 0.0186 93/500 [====>.........................] - ETA: 3:28 - loss: 0.1245 - regression_loss: 0.1059 - classification_loss: 0.0186 94/500 [====>.........................] - ETA: 3:27 - loss: 0.1236 - regression_loss: 0.1052 - classification_loss: 0.0185 95/500 [====>.........................] - ETA: 3:26 - loss: 0.1237 - regression_loss: 0.1054 - classification_loss: 0.0184 96/500 [====>.........................] - ETA: 3:26 - loss: 0.1228 - regression_loss: 0.1045 - classification_loss: 0.0182 97/500 [====>.........................] - ETA: 3:26 - loss: 0.1223 - regression_loss: 0.1042 - classification_loss: 0.0182 98/500 [====>.........................] - ETA: 3:25 - loss: 0.1217 - regression_loss: 0.1036 - classification_loss: 0.0181 99/500 [====>.........................] - ETA: 3:25 - loss: 0.1229 - regression_loss: 0.1048 - classification_loss: 0.0181 100/500 [=====>........................] 
- ETA: 3:24 - loss: 0.1230 - regression_loss: 0.1050 - classification_loss: 0.0180 101/500 [=====>........................] - ETA: 3:24 - loss: 0.1226 - regression_loss: 0.1045 - classification_loss: 0.0181 102/500 [=====>........................] - ETA: 3:23 - loss: 0.1230 - regression_loss: 0.1048 - classification_loss: 0.0182 103/500 [=====>........................] - ETA: 3:23 - loss: 0.1220 - regression_loss: 0.1040 - classification_loss: 0.0180 104/500 [=====>........................] - ETA: 3:22 - loss: 0.1242 - regression_loss: 0.1056 - classification_loss: 0.0187 105/500 [=====>........................] - ETA: 3:22 - loss: 0.1236 - regression_loss: 0.1050 - classification_loss: 0.0185 106/500 [=====>........................] - ETA: 3:21 - loss: 0.1235 - regression_loss: 0.1050 - classification_loss: 0.0185 107/500 [=====>........................] - ETA: 3:21 - loss: 0.1242 - regression_loss: 0.1058 - classification_loss: 0.0185 108/500 [=====>........................] - ETA: 3:20 - loss: 0.1234 - regression_loss: 0.1051 - classification_loss: 0.0184 109/500 [=====>........................] - ETA: 3:20 - loss: 0.1232 - regression_loss: 0.1049 - classification_loss: 0.0183 110/500 [=====>........................] - ETA: 3:19 - loss: 0.1224 - regression_loss: 0.1043 - classification_loss: 0.0181 111/500 [=====>........................] - ETA: 3:19 - loss: 0.1217 - regression_loss: 0.1037 - classification_loss: 0.0180 112/500 [=====>........................] - ETA: 3:18 - loss: 0.1212 - regression_loss: 0.1033 - classification_loss: 0.0179 113/500 [=====>........................] - ETA: 3:18 - loss: 0.1204 - regression_loss: 0.1026 - classification_loss: 0.0178 114/500 [=====>........................] - ETA: 3:17 - loss: 0.1197 - regression_loss: 0.1021 - classification_loss: 0.0177 115/500 [=====>........................] - ETA: 3:17 - loss: 0.1192 - regression_loss: 0.1016 - classification_loss: 0.0176 116/500 [=====>........................] 
- ETA: 3:16 - loss: 0.1184 - regression_loss: 0.1010 - classification_loss: 0.0175 117/500 [======>.......................] - ETA: 3:16 - loss: 0.1179 - regression_loss: 0.1005 - classification_loss: 0.0174 118/500 [======>.......................] - ETA: 3:15 - loss: 0.1177 - regression_loss: 0.1004 - classification_loss: 0.0174 119/500 [======>.......................] - ETA: 3:15 - loss: 0.1171 - regression_loss: 0.0998 - classification_loss: 0.0173 120/500 [======>.......................] - ETA: 3:14 - loss: 0.1167 - regression_loss: 0.0995 - classification_loss: 0.0172 121/500 [======>.......................] - ETA: 3:14 - loss: 0.1160 - regression_loss: 0.0989 - classification_loss: 0.0171 122/500 [======>.......................] - ETA: 3:13 - loss: 0.1153 - regression_loss: 0.0984 - classification_loss: 0.0170 123/500 [======>.......................] - ETA: 3:13 - loss: 0.1148 - regression_loss: 0.0979 - classification_loss: 0.0169 124/500 [======>.......................] - ETA: 3:12 - loss: 0.1142 - regression_loss: 0.0974 - classification_loss: 0.0168 125/500 [======>.......................] - ETA: 3:12 - loss: 0.1136 - regression_loss: 0.0969 - classification_loss: 0.0167 126/500 [======>.......................] - ETA: 3:11 - loss: 0.1134 - regression_loss: 0.0967 - classification_loss: 0.0166 127/500 [======>.......................] - ETA: 3:11 - loss: 0.1167 - regression_loss: 0.0993 - classification_loss: 0.0174 128/500 [======>.......................] - ETA: 3:10 - loss: 0.1169 - regression_loss: 0.0995 - classification_loss: 0.0174 129/500 [======>.......................] - ETA: 3:09 - loss: 0.1181 - regression_loss: 0.1007 - classification_loss: 0.0173 130/500 [======>.......................] - ETA: 3:09 - loss: 0.1182 - regression_loss: 0.1010 - classification_loss: 0.0173 131/500 [======>.......................] - ETA: 3:09 - loss: 0.1186 - regression_loss: 0.1013 - classification_loss: 0.0173 132/500 [======>.......................] 
- ETA: 3:08 - loss: 0.1182 - regression_loss: 0.1010 - classification_loss: 0.0172 133/500 [======>.......................] - ETA: 3:07 - loss: 0.1176 - regression_loss: 0.1004 - classification_loss: 0.0171 134/500 [=======>......................] - ETA: 3:07 - loss: 0.1170 - regression_loss: 0.1000 - classification_loss: 0.0170 135/500 [=======>......................] - ETA: 3:06 - loss: 0.1164 - regression_loss: 0.0995 - classification_loss: 0.0169 136/500 [=======>......................] - ETA: 3:06 - loss: 0.1161 - regression_loss: 0.0992 - classification_loss: 0.0169 137/500 [=======>......................] - ETA: 3:05 - loss: 0.1163 - regression_loss: 0.0994 - classification_loss: 0.0169 138/500 [=======>......................] - ETA: 3:05 - loss: 0.1161 - regression_loss: 0.0993 - classification_loss: 0.0168 139/500 [=======>......................] - ETA: 3:04 - loss: 0.1155 - regression_loss: 0.0988 - classification_loss: 0.0167 140/500 [=======>......................] - ETA: 3:04 - loss: 0.1150 - regression_loss: 0.0984 - classification_loss: 0.0166 141/500 [=======>......................] - ETA: 3:03 - loss: 0.1147 - regression_loss: 0.0982 - classification_loss: 0.0165 142/500 [=======>......................] - ETA: 3:03 - loss: 0.1145 - regression_loss: 0.0980 - classification_loss: 0.0165 143/500 [=======>......................] - ETA: 3:02 - loss: 0.1146 - regression_loss: 0.0981 - classification_loss: 0.0165 144/500 [=======>......................] - ETA: 3:02 - loss: 0.1141 - regression_loss: 0.0977 - classification_loss: 0.0165 145/500 [=======>......................] - ETA: 3:01 - loss: 0.1137 - regression_loss: 0.0973 - classification_loss: 0.0164 146/500 [=======>......................] - ETA: 3:01 - loss: 0.1147 - regression_loss: 0.0982 - classification_loss: 0.0165 147/500 [=======>......................] - ETA: 3:00 - loss: 0.1176 - regression_loss: 0.1008 - classification_loss: 0.0168 148/500 [=======>......................] 
- ETA: 3:00 - loss: 0.1200 - regression_loss: 0.1027 - classification_loss: 0.0173 149/500 [=======>......................] - ETA: 2:59 - loss: 0.1196 - regression_loss: 0.1023 - classification_loss: 0.0173 150/500 [========>.....................] - ETA: 2:59 - loss: 0.1191 - regression_loss: 0.1018 - classification_loss: 0.0172 151/500 [========>.....................] - ETA: 2:58 - loss: 0.1184 - regression_loss: 0.1013 - classification_loss: 0.0172 152/500 [========>.....................] - ETA: 2:58 - loss: 0.1179 - regression_loss: 0.1008 - classification_loss: 0.0171 153/500 [========>.....................] - ETA: 2:57 - loss: 0.1175 - regression_loss: 0.1004 - classification_loss: 0.0170 154/500 [========>.....................] - ETA: 2:57 - loss: 0.1170 - regression_loss: 0.1000 - classification_loss: 0.0170 155/500 [========>.....................] - ETA: 2:56 - loss: 0.1167 - regression_loss: 0.0998 - classification_loss: 0.0169 156/500 [========>.....................] - ETA: 2:56 - loss: 0.1164 - regression_loss: 0.0995 - classification_loss: 0.0169 157/500 [========>.....................] - ETA: 2:55 - loss: 0.1165 - regression_loss: 0.0996 - classification_loss: 0.0169 158/500 [========>.....................] - ETA: 2:55 - loss: 0.1161 - regression_loss: 0.0992 - classification_loss: 0.0169 159/500 [========>.....................] - ETA: 2:54 - loss: 0.1157 - regression_loss: 0.0989 - classification_loss: 0.0168 160/500 [========>.....................] - ETA: 2:54 - loss: 0.1153 - regression_loss: 0.0986 - classification_loss: 0.0167 161/500 [========>.....................] - ETA: 2:53 - loss: 0.1150 - regression_loss: 0.0984 - classification_loss: 0.0167 162/500 [========>.....................] - ETA: 2:53 - loss: 0.1144 - regression_loss: 0.0979 - classification_loss: 0.0166 163/500 [========>.....................] - ETA: 2:52 - loss: 0.1141 - regression_loss: 0.0976 - classification_loss: 0.0165 164/500 [========>.....................] 
- ETA: 2:51 - loss: 0.1138 - regression_loss: 0.0974 - classification_loss: 0.0165 165/500 [========>.....................] - ETA: 2:51 - loss: 0.1140 - regression_loss: 0.0976 - classification_loss: 0.0164 166/500 [========>.....................] - ETA: 2:51 - loss: 0.1136 - regression_loss: 0.0972 - classification_loss: 0.0164 167/500 [=========>....................] - ETA: 2:50 - loss: 0.1132 - regression_loss: 0.0968 - classification_loss: 0.0163 168/500 [=========>....................] - ETA: 2:49 - loss: 0.1128 - regression_loss: 0.0965 - classification_loss: 0.0163 169/500 [=========>....................] - ETA: 2:49 - loss: 0.1123 - regression_loss: 0.0961 - classification_loss: 0.0162 170/500 [=========>....................] - ETA: 2:48 - loss: 0.1127 - regression_loss: 0.0965 - classification_loss: 0.0162 171/500 [=========>....................] - ETA: 2:48 - loss: 0.1123 - regression_loss: 0.0961 - classification_loss: 0.0162 172/500 [=========>....................] - ETA: 2:47 - loss: 0.1118 - regression_loss: 0.0957 - classification_loss: 0.0161 173/500 [=========>....................] - ETA: 2:47 - loss: 0.1116 - regression_loss: 0.0956 - classification_loss: 0.0161 174/500 [=========>....................] - ETA: 2:46 - loss: 0.1149 - regression_loss: 0.0985 - classification_loss: 0.0164 175/500 [=========>....................] - ETA: 2:46 - loss: 0.1145 - regression_loss: 0.0982 - classification_loss: 0.0163 176/500 [=========>....................] - ETA: 2:45 - loss: 0.1144 - regression_loss: 0.0982 - classification_loss: 0.0163 177/500 [=========>....................] - ETA: 2:45 - loss: 0.1141 - regression_loss: 0.0979 - classification_loss: 0.0162 178/500 [=========>....................] - ETA: 2:44 - loss: 0.1139 - regression_loss: 0.0977 - classification_loss: 0.0162 179/500 [=========>....................] - ETA: 2:44 - loss: 0.1136 - regression_loss: 0.0975 - classification_loss: 0.0162 180/500 [=========>....................] 
- ETA: 2:43 - loss: 0.1138 - regression_loss: 0.0977 - classification_loss: 0.0161 181/500 [=========>....................] - ETA: 2:43 - loss: 0.1154 - regression_loss: 0.0993 - classification_loss: 0.0161 182/500 [=========>....................] - ETA: 2:42 - loss: 0.1150 - regression_loss: 0.0989 - classification_loss: 0.0161 183/500 [=========>....................] - ETA: 2:42 - loss: 0.1147 - regression_loss: 0.0987 - classification_loss: 0.0160 184/500 [==========>...................] - ETA: 2:41 - loss: 0.1144 - regression_loss: 0.0984 - classification_loss: 0.0160 185/500 [==========>...................] - ETA: 2:40 - loss: 0.1142 - regression_loss: 0.0983 - classification_loss: 0.0159 186/500 [==========>...................] - ETA: 2:40 - loss: 0.1138 - regression_loss: 0.0979 - classification_loss: 0.0159 187/500 [==========>...................] - ETA: 2:39 - loss: 0.1134 - regression_loss: 0.0976 - classification_loss: 0.0158 188/500 [==========>...................] - ETA: 2:39 - loss: 0.1131 - regression_loss: 0.0973 - classification_loss: 0.0158 189/500 [==========>...................] - ETA: 2:38 - loss: 0.1127 - regression_loss: 0.0970 - classification_loss: 0.0157 190/500 [==========>...................] - ETA: 2:38 - loss: 0.1123 - regression_loss: 0.0966 - classification_loss: 0.0157 191/500 [==========>...................] - ETA: 2:37 - loss: 0.1121 - regression_loss: 0.0965 - classification_loss: 0.0156 192/500 [==========>...................] - ETA: 2:37 - loss: 0.1118 - regression_loss: 0.0962 - classification_loss: 0.0156 193/500 [==========>...................] - ETA: 2:36 - loss: 0.1120 - regression_loss: 0.0965 - classification_loss: 0.0156 194/500 [==========>...................] - ETA: 2:36 - loss: 0.1117 - regression_loss: 0.0962 - classification_loss: 0.0155 195/500 [==========>...................] - ETA: 2:35 - loss: 0.1117 - regression_loss: 0.0962 - classification_loss: 0.0155 196/500 [==========>...................] 
- ETA: 2:35 - loss: 0.1113 - regression_loss: 0.0959 - classification_loss: 0.0155 197/500 [==========>...................] - ETA: 2:34 - loss: 0.1110 - regression_loss: 0.0956 - classification_loss: 0.0154 198/500 [==========>...................] - ETA: 2:34 - loss: 0.1105 - regression_loss: 0.0952 - classification_loss: 0.0153 199/500 [==========>...................] - ETA: 2:33 - loss: 0.1106 - regression_loss: 0.0953 - classification_loss: 0.0153 200/500 [===========>..................] - ETA: 2:33 - loss: 0.1103 - regression_loss: 0.0950 - classification_loss: 0.0153 201/500 [===========>..................] - ETA: 2:32 - loss: 0.1099 - regression_loss: 0.0947 - classification_loss: 0.0152 202/500 [===========>..................] - ETA: 2:32 - loss: 0.1098 - regression_loss: 0.0946 - classification_loss: 0.0152 203/500 [===========>..................] - ETA: 2:31 - loss: 0.1094 - regression_loss: 0.0943 - classification_loss: 0.0152 204/500 [===========>..................] - ETA: 2:31 - loss: 0.1092 - regression_loss: 0.0941 - classification_loss: 0.0151 205/500 [===========>..................] - ETA: 2:30 - loss: 0.1090 - regression_loss: 0.0939 - classification_loss: 0.0151 206/500 [===========>..................] - ETA: 2:30 - loss: 0.1086 - regression_loss: 0.0936 - classification_loss: 0.0150 207/500 [===========>..................] - ETA: 2:29 - loss: 0.1083 - regression_loss: 0.0933 - classification_loss: 0.0150 208/500 [===========>..................] - ETA: 2:29 - loss: 0.1082 - regression_loss: 0.0933 - classification_loss: 0.0149 209/500 [===========>..................] - ETA: 2:28 - loss: 0.1081 - regression_loss: 0.0932 - classification_loss: 0.0149 210/500 [===========>..................] - ETA: 2:28 - loss: 0.1079 - regression_loss: 0.0930 - classification_loss: 0.0149 211/500 [===========>..................] - ETA: 2:27 - loss: 0.1084 - regression_loss: 0.0935 - classification_loss: 0.0149 212/500 [===========>..................] 
- ETA: 2:27 - loss: 0.1083 - regression_loss: 0.0934 - classification_loss: 0.0149 213/500 [===========>..................] - ETA: 2:26 - loss: 0.1078 - regression_loss: 0.0930 - classification_loss: 0.0148 214/500 [===========>..................] - ETA: 2:26 - loss: 0.1077 - regression_loss: 0.0929 - classification_loss: 0.0148 215/500 [===========>..................] - ETA: 2:25 - loss: 0.1075 - regression_loss: 0.0927 - classification_loss: 0.0148 216/500 [===========>..................] - ETA: 2:25 - loss: 0.1075 - regression_loss: 0.0928 - classification_loss: 0.0148 217/500 [============>.................] - ETA: 2:24 - loss: 0.1079 - regression_loss: 0.0932 - classification_loss: 0.0148 218/500 [============>.................] - ETA: 2:24 - loss: 0.1077 - regression_loss: 0.0929 - classification_loss: 0.0148 219/500 [============>.................] - ETA: 2:23 - loss: 0.1078 - regression_loss: 0.0930 - classification_loss: 0.0148 220/500 [============>.................] - ETA: 2:23 - loss: 0.1079 - regression_loss: 0.0932 - classification_loss: 0.0147 221/500 [============>.................] - ETA: 2:22 - loss: 0.1115 - regression_loss: 0.0956 - classification_loss: 0.0158 222/500 [============>.................] - ETA: 2:22 - loss: 0.1112 - regression_loss: 0.0954 - classification_loss: 0.0158 223/500 [============>.................] - ETA: 2:21 - loss: 0.1108 - regression_loss: 0.0951 - classification_loss: 0.0158 224/500 [============>.................] - ETA: 2:21 - loss: 0.1106 - regression_loss: 0.0949 - classification_loss: 0.0157 225/500 [============>.................] - ETA: 2:20 - loss: 0.1105 - regression_loss: 0.0949 - classification_loss: 0.0157 226/500 [============>.................] - ETA: 2:20 - loss: 0.1103 - regression_loss: 0.0946 - classification_loss: 0.0157 227/500 [============>.................] - ETA: 2:19 - loss: 0.1100 - regression_loss: 0.0944 - classification_loss: 0.0156 228/500 [============>.................] 
- ETA: 2:19 - loss: 0.1103 - regression_loss: 0.0947 - classification_loss: 0.0156 229/500 [============>.................] - ETA: 2:18 - loss: 0.1102 - regression_loss: 0.0946 - classification_loss: 0.0156 230/500 [============>.................] - ETA: 2:17 - loss: 0.1115 - regression_loss: 0.0959 - classification_loss: 0.0156 231/500 [============>.................] - ETA: 2:17 - loss: 0.1114 - regression_loss: 0.0958 - classification_loss: 0.0156 232/500 [============>.................] - ETA: 2:16 - loss: 0.1111 - regression_loss: 0.0956 - classification_loss: 0.0155 233/500 [============>.................] - ETA: 2:16 - loss: 0.1109 - regression_loss: 0.0954 - classification_loss: 0.0155 234/500 [=============>................] - ETA: 2:15 - loss: 0.1106 - regression_loss: 0.0952 - classification_loss: 0.0155 235/500 [=============>................] - ETA: 2:15 - loss: 0.1104 - regression_loss: 0.0950 - classification_loss: 0.0154 236/500 [=============>................] - ETA: 2:14 - loss: 0.1102 - regression_loss: 0.0948 - classification_loss: 0.0154 237/500 [=============>................] - ETA: 2:14 - loss: 0.1100 - regression_loss: 0.0947 - classification_loss: 0.0154 238/500 [=============>................] - ETA: 2:13 - loss: 0.1103 - regression_loss: 0.0949 - classification_loss: 0.0154 239/500 [=============>................] - ETA: 2:13 - loss: 0.1099 - regression_loss: 0.0946 - classification_loss: 0.0153 240/500 [=============>................] - ETA: 2:12 - loss: 0.1115 - regression_loss: 0.0961 - classification_loss: 0.0154 241/500 [=============>................] - ETA: 2:12 - loss: 0.1112 - regression_loss: 0.0959 - classification_loss: 0.0153 242/500 [=============>................] - ETA: 2:11 - loss: 0.1109 - regression_loss: 0.0956 - classification_loss: 0.0153 243/500 [=============>................] - ETA: 2:11 - loss: 0.1106 - regression_loss: 0.0954 - classification_loss: 0.0153 244/500 [=============>................] 
- ETA: 2:10 - loss: 0.1103 - regression_loss: 0.0951 - classification_loss: 0.0152 245/500 [=============>................] - ETA: 2:10 - loss: 0.1102 - regression_loss: 0.0950 - classification_loss: 0.0152 246/500 [=============>................] - ETA: 2:09 - loss: 0.1107 - regression_loss: 0.0955 - classification_loss: 0.0152 247/500 [=============>................] - ETA: 2:09 - loss: 0.1104 - regression_loss: 0.0952 - classification_loss: 0.0152 248/500 [=============>................] - ETA: 2:08 - loss: 0.1102 - regression_loss: 0.0951 - classification_loss: 0.0152 249/500 [=============>................] - ETA: 2:08 - loss: 0.1103 - regression_loss: 0.0952 - classification_loss: 0.0151 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1101 - regression_loss: 0.0949 - classification_loss: 0.0151 251/500 [==============>...............] - ETA: 2:07 - loss: 0.1098 - regression_loss: 0.0947 - classification_loss: 0.0151 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1095 - regression_loss: 0.0944 - classification_loss: 0.0151 253/500 [==============>...............] - ETA: 2:06 - loss: 0.1094 - regression_loss: 0.0943 - classification_loss: 0.0150 254/500 [==============>...............] - ETA: 2:05 - loss: 0.1091 - regression_loss: 0.0941 - classification_loss: 0.0150 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1088 - regression_loss: 0.0938 - classification_loss: 0.0150 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1085 - regression_loss: 0.0936 - classification_loss: 0.0149 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1084 - regression_loss: 0.0935 - classification_loss: 0.0149 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1081 - regression_loss: 0.0932 - classification_loss: 0.0149 259/500 [==============>...............] - ETA: 2:03 - loss: 0.1082 - regression_loss: 0.0933 - classification_loss: 0.0149 260/500 [==============>...............] 
- ETA: 2:02 - loss: 0.1082 - regression_loss: 0.0933 - classification_loss: 0.0148 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1078 - regression_loss: 0.0930 - classification_loss: 0.0148 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1076 - regression_loss: 0.0928 - classification_loss: 0.0148 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1077 - regression_loss: 0.0930 - classification_loss: 0.0148 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1075 - regression_loss: 0.0928 - classification_loss: 0.0148 265/500 [==============>...............] - ETA: 2:00 - loss: 0.1075 - regression_loss: 0.0927 - classification_loss: 0.0148 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1073 - regression_loss: 0.0926 - classification_loss: 0.0147 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1072 - regression_loss: 0.0925 - classification_loss: 0.0147 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1081 - regression_loss: 0.0932 - classification_loss: 0.0150 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1080 - regression_loss: 0.0930 - classification_loss: 0.0150 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1077 - regression_loss: 0.0928 - classification_loss: 0.0149 271/500 [===============>..............] - ETA: 1:57 - loss: 0.1077 - regression_loss: 0.0928 - classification_loss: 0.0149 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1075 - regression_loss: 0.0927 - classification_loss: 0.0149 273/500 [===============>..............] - ETA: 1:56 - loss: 0.1075 - regression_loss: 0.0926 - classification_loss: 0.0149 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1072 - regression_loss: 0.0924 - classification_loss: 0.0148 275/500 [===============>..............] - ETA: 1:55 - loss: 0.1070 - regression_loss: 0.0922 - classification_loss: 0.0148 276/500 [===============>..............] 
500/500 [==============================] - 256s 511ms/step - loss: 0.0986 - regression_loss: 0.0853 - classification_loss: 0.0132
326 instances of class plum with average precision: 0.8427
mAP: 0.8427
Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5
Epoch 16/150
109/500 [=====>........................] - ETA: 3:20 - loss: 0.1094 - regression_loss: 0.0952 - classification_loss: 0.0141
- ETA: 3:19 - loss: 0.1107 - regression_loss: 0.0964 - classification_loss: 0.0143 111/500 [=====>........................] - ETA: 3:19 - loss: 0.1104 - regression_loss: 0.0962 - classification_loss: 0.0142 112/500 [=====>........................] - ETA: 3:18 - loss: 0.1141 - regression_loss: 0.0991 - classification_loss: 0.0150 113/500 [=====>........................] - ETA: 3:18 - loss: 0.1139 - regression_loss: 0.0989 - classification_loss: 0.0150 114/500 [=====>........................] - ETA: 3:17 - loss: 0.1131 - regression_loss: 0.0982 - classification_loss: 0.0149 115/500 [=====>........................] - ETA: 3:17 - loss: 0.1127 - regression_loss: 0.0978 - classification_loss: 0.0149 116/500 [=====>........................] - ETA: 3:16 - loss: 0.1121 - regression_loss: 0.0973 - classification_loss: 0.0148 117/500 [======>.......................] - ETA: 3:16 - loss: 0.1124 - regression_loss: 0.0976 - classification_loss: 0.0148 118/500 [======>.......................] - ETA: 3:15 - loss: 0.1122 - regression_loss: 0.0974 - classification_loss: 0.0148 119/500 [======>.......................] - ETA: 3:15 - loss: 0.1119 - regression_loss: 0.0972 - classification_loss: 0.0147 120/500 [======>.......................] - ETA: 3:14 - loss: 0.1114 - regression_loss: 0.0967 - classification_loss: 0.0146 121/500 [======>.......................] - ETA: 3:14 - loss: 0.1108 - regression_loss: 0.0963 - classification_loss: 0.0145 122/500 [======>.......................] - ETA: 3:13 - loss: 0.1105 - regression_loss: 0.0960 - classification_loss: 0.0145 123/500 [======>.......................] - ETA: 3:13 - loss: 0.1105 - regression_loss: 0.0960 - classification_loss: 0.0145 124/500 [======>.......................] - ETA: 3:12 - loss: 0.1101 - regression_loss: 0.0957 - classification_loss: 0.0144 125/500 [======>.......................] - ETA: 3:12 - loss: 0.1097 - regression_loss: 0.0954 - classification_loss: 0.0144 126/500 [======>.......................] 
- ETA: 3:11 - loss: 0.1092 - regression_loss: 0.0949 - classification_loss: 0.0143 127/500 [======>.......................] - ETA: 3:11 - loss: 0.1087 - regression_loss: 0.0945 - classification_loss: 0.0142 128/500 [======>.......................] - ETA: 3:10 - loss: 0.1096 - regression_loss: 0.0954 - classification_loss: 0.0142 129/500 [======>.......................] - ETA: 3:10 - loss: 0.1090 - regression_loss: 0.0949 - classification_loss: 0.0141 130/500 [======>.......................] - ETA: 3:09 - loss: 0.1087 - regression_loss: 0.0946 - classification_loss: 0.0141 131/500 [======>.......................] - ETA: 3:09 - loss: 0.1082 - regression_loss: 0.0942 - classification_loss: 0.0141 132/500 [======>.......................] - ETA: 3:08 - loss: 0.1080 - regression_loss: 0.0939 - classification_loss: 0.0140 133/500 [======>.......................] - ETA: 3:08 - loss: 0.1077 - regression_loss: 0.0938 - classification_loss: 0.0140 134/500 [=======>......................] - ETA: 3:07 - loss: 0.1072 - regression_loss: 0.0933 - classification_loss: 0.0139 135/500 [=======>......................] - ETA: 3:07 - loss: 0.1074 - regression_loss: 0.0934 - classification_loss: 0.0140 136/500 [=======>......................] - ETA: 3:06 - loss: 0.1092 - regression_loss: 0.0947 - classification_loss: 0.0145 137/500 [=======>......................] - ETA: 3:06 - loss: 0.1088 - regression_loss: 0.0944 - classification_loss: 0.0144 138/500 [=======>......................] - ETA: 3:05 - loss: 0.1084 - regression_loss: 0.0941 - classification_loss: 0.0144 139/500 [=======>......................] - ETA: 3:05 - loss: 0.1124 - regression_loss: 0.0977 - classification_loss: 0.0147 140/500 [=======>......................] - ETA: 3:04 - loss: 0.1120 - regression_loss: 0.0973 - classification_loss: 0.0147 141/500 [=======>......................] - ETA: 3:03 - loss: 0.1114 - regression_loss: 0.0968 - classification_loss: 0.0146 142/500 [=======>......................] 
- ETA: 3:03 - loss: 0.1122 - regression_loss: 0.0976 - classification_loss: 0.0146 143/500 [=======>......................] - ETA: 3:02 - loss: 0.1117 - regression_loss: 0.0971 - classification_loss: 0.0146 144/500 [=======>......................] - ETA: 3:02 - loss: 0.1112 - regression_loss: 0.0967 - classification_loss: 0.0145 145/500 [=======>......................] - ETA: 3:01 - loss: 0.1108 - regression_loss: 0.0963 - classification_loss: 0.0145 146/500 [=======>......................] - ETA: 3:01 - loss: 0.1104 - regression_loss: 0.0960 - classification_loss: 0.0144 147/500 [=======>......................] - ETA: 3:00 - loss: 0.1099 - regression_loss: 0.0956 - classification_loss: 0.0143 148/500 [=======>......................] - ETA: 3:00 - loss: 0.1098 - regression_loss: 0.0955 - classification_loss: 0.0143 149/500 [=======>......................] - ETA: 2:59 - loss: 0.1135 - regression_loss: 0.0988 - classification_loss: 0.0147 150/500 [========>.....................] - ETA: 2:59 - loss: 0.1138 - regression_loss: 0.0991 - classification_loss: 0.0147 151/500 [========>.....................] - ETA: 2:58 - loss: 0.1138 - regression_loss: 0.0991 - classification_loss: 0.0146 152/500 [========>.....................] - ETA: 2:58 - loss: 0.1132 - regression_loss: 0.0986 - classification_loss: 0.0145 153/500 [========>.....................] - ETA: 2:57 - loss: 0.1125 - regression_loss: 0.0981 - classification_loss: 0.0145 154/500 [========>.....................] - ETA: 2:57 - loss: 0.1122 - regression_loss: 0.0978 - classification_loss: 0.0144 155/500 [========>.....................] - ETA: 2:56 - loss: 0.1121 - regression_loss: 0.0977 - classification_loss: 0.0144 156/500 [========>.....................] - ETA: 2:56 - loss: 0.1119 - regression_loss: 0.0976 - classification_loss: 0.0143 157/500 [========>.....................] - ETA: 2:55 - loss: 0.1115 - regression_loss: 0.0972 - classification_loss: 0.0143 158/500 [========>.....................] 
- ETA: 2:55 - loss: 0.1110 - regression_loss: 0.0968 - classification_loss: 0.0142 159/500 [========>.....................] - ETA: 2:54 - loss: 0.1105 - regression_loss: 0.0963 - classification_loss: 0.0142 160/500 [========>.....................] - ETA: 2:54 - loss: 0.1101 - regression_loss: 0.0960 - classification_loss: 0.0141 161/500 [========>.....................] - ETA: 2:53 - loss: 0.1101 - regression_loss: 0.0960 - classification_loss: 0.0141 162/500 [========>.....................] - ETA: 2:53 - loss: 0.1098 - regression_loss: 0.0958 - classification_loss: 0.0141 163/500 [========>.....................] - ETA: 2:52 - loss: 0.1094 - regression_loss: 0.0954 - classification_loss: 0.0140 164/500 [========>.....................] - ETA: 2:52 - loss: 0.1089 - regression_loss: 0.0949 - classification_loss: 0.0139 165/500 [========>.....................] - ETA: 2:51 - loss: 0.1083 - regression_loss: 0.0945 - classification_loss: 0.0139 166/500 [========>.....................] - ETA: 2:51 - loss: 0.1083 - regression_loss: 0.0945 - classification_loss: 0.0138 167/500 [=========>....................] - ETA: 2:50 - loss: 0.1094 - regression_loss: 0.0955 - classification_loss: 0.0138 168/500 [=========>....................] - ETA: 2:50 - loss: 0.1112 - regression_loss: 0.0973 - classification_loss: 0.0139 169/500 [=========>....................] - ETA: 2:49 - loss: 0.1107 - regression_loss: 0.0969 - classification_loss: 0.0138 170/500 [=========>....................] - ETA: 2:49 - loss: 0.1104 - regression_loss: 0.0965 - classification_loss: 0.0138 171/500 [=========>....................] - ETA: 2:48 - loss: 0.1099 - regression_loss: 0.0962 - classification_loss: 0.0138 172/500 [=========>....................] - ETA: 2:48 - loss: 0.1101 - regression_loss: 0.0964 - classification_loss: 0.0137 173/500 [=========>....................] - ETA: 2:47 - loss: 0.1098 - regression_loss: 0.0961 - classification_loss: 0.0137 174/500 [=========>....................] 
- ETA: 2:47 - loss: 0.1095 - regression_loss: 0.0958 - classification_loss: 0.0137 175/500 [=========>....................] - ETA: 2:46 - loss: 0.1092 - regression_loss: 0.0955 - classification_loss: 0.0136 176/500 [=========>....................] - ETA: 2:46 - loss: 0.1095 - regression_loss: 0.0959 - classification_loss: 0.0136 177/500 [=========>....................] - ETA: 2:45 - loss: 0.1092 - regression_loss: 0.0956 - classification_loss: 0.0136 178/500 [=========>....................] - ETA: 2:44 - loss: 0.1089 - regression_loss: 0.0953 - classification_loss: 0.0136 179/500 [=========>....................] - ETA: 2:44 - loss: 0.1086 - regression_loss: 0.0950 - classification_loss: 0.0135 180/500 [=========>....................] - ETA: 2:43 - loss: 0.1083 - regression_loss: 0.0948 - classification_loss: 0.0135 181/500 [=========>....................] - ETA: 2:43 - loss: 0.1090 - regression_loss: 0.0955 - classification_loss: 0.0135 182/500 [=========>....................] - ETA: 2:42 - loss: 0.1087 - regression_loss: 0.0953 - classification_loss: 0.0135 183/500 [=========>....................] - ETA: 2:42 - loss: 0.1086 - regression_loss: 0.0952 - classification_loss: 0.0134 184/500 [==========>...................] - ETA: 2:41 - loss: 0.1085 - regression_loss: 0.0951 - classification_loss: 0.0134 185/500 [==========>...................] - ETA: 2:41 - loss: 0.1081 - regression_loss: 0.0947 - classification_loss: 0.0133 186/500 [==========>...................] - ETA: 2:40 - loss: 0.1077 - regression_loss: 0.0944 - classification_loss: 0.0133 187/500 [==========>...................] - ETA: 2:40 - loss: 0.1079 - regression_loss: 0.0946 - classification_loss: 0.0133 188/500 [==========>...................] - ETA: 2:39 - loss: 0.1079 - regression_loss: 0.0946 - classification_loss: 0.0132 189/500 [==========>...................] - ETA: 2:39 - loss: 0.1080 - regression_loss: 0.0947 - classification_loss: 0.0132 190/500 [==========>...................] 
- ETA: 2:38 - loss: 0.1076 - regression_loss: 0.0944 - classification_loss: 0.0132 191/500 [==========>...................] - ETA: 2:38 - loss: 0.1072 - regression_loss: 0.0940 - classification_loss: 0.0132 192/500 [==========>...................] - ETA: 2:37 - loss: 0.1069 - regression_loss: 0.0938 - classification_loss: 0.0131 193/500 [==========>...................] - ETA: 2:37 - loss: 0.1066 - regression_loss: 0.0935 - classification_loss: 0.0131 194/500 [==========>...................] - ETA: 2:36 - loss: 0.1064 - regression_loss: 0.0934 - classification_loss: 0.0131 195/500 [==========>...................] - ETA: 2:36 - loss: 0.1060 - regression_loss: 0.0930 - classification_loss: 0.0130 196/500 [==========>...................] - ETA: 2:35 - loss: 0.1057 - regression_loss: 0.0927 - classification_loss: 0.0130 197/500 [==========>...................] - ETA: 2:35 - loss: 0.1054 - regression_loss: 0.0925 - classification_loss: 0.0130 198/500 [==========>...................] - ETA: 2:34 - loss: 0.1056 - regression_loss: 0.0927 - classification_loss: 0.0130 199/500 [==========>...................] - ETA: 2:34 - loss: 0.1062 - regression_loss: 0.0932 - classification_loss: 0.0130 200/500 [===========>..................] - ETA: 2:33 - loss: 0.1059 - regression_loss: 0.0930 - classification_loss: 0.0130 201/500 [===========>..................] - ETA: 2:32 - loss: 0.1056 - regression_loss: 0.0927 - classification_loss: 0.0129 202/500 [===========>..................] - ETA: 2:32 - loss: 0.1056 - regression_loss: 0.0927 - classification_loss: 0.0129 203/500 [===========>..................] - ETA: 2:31 - loss: 0.1057 - regression_loss: 0.0928 - classification_loss: 0.0129 204/500 [===========>..................] - ETA: 2:31 - loss: 0.1055 - regression_loss: 0.0927 - classification_loss: 0.0129 205/500 [===========>..................] - ETA: 2:30 - loss: 0.1054 - regression_loss: 0.0925 - classification_loss: 0.0129 206/500 [===========>..................] 
- ETA: 2:30 - loss: 0.1051 - regression_loss: 0.0923 - classification_loss: 0.0129 207/500 [===========>..................] - ETA: 2:29 - loss: 0.1051 - regression_loss: 0.0922 - classification_loss: 0.0129 208/500 [===========>..................] - ETA: 2:29 - loss: 0.1049 - regression_loss: 0.0921 - classification_loss: 0.0129 209/500 [===========>..................] - ETA: 2:28 - loss: 0.1050 - regression_loss: 0.0921 - classification_loss: 0.0129 210/500 [===========>..................] - ETA: 2:28 - loss: 0.1049 - regression_loss: 0.0921 - classification_loss: 0.0129 211/500 [===========>..................] - ETA: 2:27 - loss: 0.1046 - regression_loss: 0.0918 - classification_loss: 0.0128 212/500 [===========>..................] - ETA: 2:27 - loss: 0.1045 - regression_loss: 0.0918 - classification_loss: 0.0128 213/500 [===========>..................] - ETA: 2:26 - loss: 0.1044 - regression_loss: 0.0916 - classification_loss: 0.0128 214/500 [===========>..................] - ETA: 2:26 - loss: 0.1041 - regression_loss: 0.0914 - classification_loss: 0.0127 215/500 [===========>..................] - ETA: 2:25 - loss: 0.1041 - regression_loss: 0.0914 - classification_loss: 0.0127 216/500 [===========>..................] - ETA: 2:25 - loss: 0.1039 - regression_loss: 0.0912 - classification_loss: 0.0127 217/500 [============>.................] - ETA: 2:24 - loss: 0.1040 - regression_loss: 0.0912 - classification_loss: 0.0127 218/500 [============>.................] - ETA: 2:24 - loss: 0.1039 - regression_loss: 0.0912 - classification_loss: 0.0127 219/500 [============>.................] - ETA: 2:23 - loss: 0.1056 - regression_loss: 0.0929 - classification_loss: 0.0128 220/500 [============>.................] - ETA: 2:23 - loss: 0.1054 - regression_loss: 0.0927 - classification_loss: 0.0128 221/500 [============>.................] - ETA: 2:22 - loss: 0.1057 - regression_loss: 0.0930 - classification_loss: 0.0128 222/500 [============>.................] 
- ETA: 2:22 - loss: 0.1055 - regression_loss: 0.0928 - classification_loss: 0.0127 223/500 [============>.................] - ETA: 2:21 - loss: 0.1054 - regression_loss: 0.0927 - classification_loss: 0.0127 224/500 [============>.................] - ETA: 2:21 - loss: 0.1054 - regression_loss: 0.0927 - classification_loss: 0.0127 225/500 [============>.................] - ETA: 2:20 - loss: 0.1051 - regression_loss: 0.0924 - classification_loss: 0.0126 226/500 [============>.................] - ETA: 2:20 - loss: 0.1048 - regression_loss: 0.0922 - classification_loss: 0.0126 227/500 [============>.................] - ETA: 2:19 - loss: 0.1045 - regression_loss: 0.0919 - classification_loss: 0.0126 228/500 [============>.................] - ETA: 2:19 - loss: 0.1045 - regression_loss: 0.0919 - classification_loss: 0.0126 229/500 [============>.................] - ETA: 2:18 - loss: 0.1047 - regression_loss: 0.0922 - classification_loss: 0.0126 230/500 [============>.................] - ETA: 2:18 - loss: 0.1050 - regression_loss: 0.0924 - classification_loss: 0.0126 231/500 [============>.................] - ETA: 2:17 - loss: 0.1049 - regression_loss: 0.0923 - classification_loss: 0.0126 232/500 [============>.................] - ETA: 2:17 - loss: 0.1047 - regression_loss: 0.0921 - classification_loss: 0.0126 233/500 [============>.................] - ETA: 2:16 - loss: 0.1043 - regression_loss: 0.0918 - classification_loss: 0.0125 234/500 [=============>................] - ETA: 2:16 - loss: 0.1041 - regression_loss: 0.0916 - classification_loss: 0.0125 235/500 [=============>................] - ETA: 2:15 - loss: 0.1039 - regression_loss: 0.0914 - classification_loss: 0.0125 236/500 [=============>................] - ETA: 2:15 - loss: 0.1038 - regression_loss: 0.0913 - classification_loss: 0.0125 237/500 [=============>................] - ETA: 2:14 - loss: 0.1035 - regression_loss: 0.0911 - classification_loss: 0.0125 238/500 [=============>................] 
- ETA: 2:14 - loss: 0.1032 - regression_loss: 0.0908 - classification_loss: 0.0124 239/500 [=============>................] - ETA: 2:13 - loss: 0.1047 - regression_loss: 0.0919 - classification_loss: 0.0128 240/500 [=============>................] - ETA: 2:13 - loss: 0.1044 - regression_loss: 0.0916 - classification_loss: 0.0128 241/500 [=============>................] - ETA: 2:12 - loss: 0.1041 - regression_loss: 0.0914 - classification_loss: 0.0127 242/500 [=============>................] - ETA: 2:12 - loss: 0.1041 - regression_loss: 0.0913 - classification_loss: 0.0127 243/500 [=============>................] - ETA: 2:11 - loss: 0.1043 - regression_loss: 0.0916 - classification_loss: 0.0127 244/500 [=============>................] - ETA: 2:10 - loss: 0.1046 - regression_loss: 0.0918 - classification_loss: 0.0128 245/500 [=============>................] - ETA: 2:10 - loss: 0.1043 - regression_loss: 0.0916 - classification_loss: 0.0128 246/500 [=============>................] - ETA: 2:09 - loss: 0.1041 - regression_loss: 0.0914 - classification_loss: 0.0127 247/500 [=============>................] - ETA: 2:09 - loss: 0.1041 - regression_loss: 0.0914 - classification_loss: 0.0127 248/500 [=============>................] - ETA: 2:08 - loss: 0.1040 - regression_loss: 0.0913 - classification_loss: 0.0127 249/500 [=============>................] - ETA: 2:08 - loss: 0.1038 - regression_loss: 0.0911 - classification_loss: 0.0127 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1040 - regression_loss: 0.0913 - classification_loss: 0.0127 251/500 [==============>...............] - ETA: 2:07 - loss: 0.1038 - regression_loss: 0.0911 - classification_loss: 0.0126 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1037 - regression_loss: 0.0910 - classification_loss: 0.0126 253/500 [==============>...............] - ETA: 2:06 - loss: 0.1035 - regression_loss: 0.0909 - classification_loss: 0.0126 254/500 [==============>...............] 
- ETA: 2:05 - loss: 0.1034 - regression_loss: 0.0908 - classification_loss: 0.0126 255/500 [==============>...............] - ETA: 2:05 - loss: 0.1035 - regression_loss: 0.0909 - classification_loss: 0.0126 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1032 - regression_loss: 0.0906 - classification_loss: 0.0126 257/500 [==============>...............] - ETA: 2:04 - loss: 0.1030 - regression_loss: 0.0905 - classification_loss: 0.0126 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1027 - regression_loss: 0.0902 - classification_loss: 0.0125 259/500 [==============>...............] - ETA: 2:03 - loss: 0.1028 - regression_loss: 0.0903 - classification_loss: 0.0125 260/500 [==============>...............] - ETA: 2:02 - loss: 0.1030 - regression_loss: 0.0905 - classification_loss: 0.0125 261/500 [==============>...............] - ETA: 2:02 - loss: 0.1027 - regression_loss: 0.0902 - classification_loss: 0.0125 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1025 - regression_loss: 0.0900 - classification_loss: 0.0125 263/500 [==============>...............] - ETA: 2:01 - loss: 0.1029 - regression_loss: 0.0904 - classification_loss: 0.0125 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1026 - regression_loss: 0.0902 - classification_loss: 0.0125 265/500 [==============>...............] - ETA: 2:00 - loss: 0.1024 - regression_loss: 0.0899 - classification_loss: 0.0125 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1022 - regression_loss: 0.0898 - classification_loss: 0.0124 267/500 [===============>..............] - ETA: 1:59 - loss: 0.1020 - regression_loss: 0.0896 - classification_loss: 0.0124 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1018 - regression_loss: 0.0894 - classification_loss: 0.0124 269/500 [===============>..............] - ETA: 1:58 - loss: 0.1019 - regression_loss: 0.0895 - classification_loss: 0.0124 270/500 [===============>..............] 
- ETA: 1:57 - loss: 0.1017 - regression_loss: 0.0893 - classification_loss: 0.0123 271/500 [===============>..............] - ETA: 1:57 - loss: 0.1015 - regression_loss: 0.0891 - classification_loss: 0.0123 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1012 - regression_loss: 0.0888 - classification_loss: 0.0123 273/500 [===============>..............] - ETA: 1:56 - loss: 0.1016 - regression_loss: 0.0893 - classification_loss: 0.0123 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1014 - regression_loss: 0.0891 - classification_loss: 0.0123 275/500 [===============>..............] - ETA: 1:55 - loss: 0.1013 - regression_loss: 0.0890 - classification_loss: 0.0123 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1012 - regression_loss: 0.0889 - classification_loss: 0.0123 277/500 [===============>..............] - ETA: 1:54 - loss: 0.1010 - regression_loss: 0.0887 - classification_loss: 0.0122 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1007 - regression_loss: 0.0885 - classification_loss: 0.0122 279/500 [===============>..............] - ETA: 1:53 - loss: 0.1004 - regression_loss: 0.0883 - classification_loss: 0.0122 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1005 - regression_loss: 0.0883 - classification_loss: 0.0122 281/500 [===============>..............] - ETA: 1:51 - loss: 0.1004 - regression_loss: 0.0882 - classification_loss: 0.0122 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1003 - regression_loss: 0.0881 - classification_loss: 0.0122 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1001 - regression_loss: 0.0879 - classification_loss: 0.0121 284/500 [================>.............] - ETA: 1:50 - loss: 0.1000 - regression_loss: 0.0878 - classification_loss: 0.0121 285/500 [================>.............] - ETA: 1:49 - loss: 0.0998 - regression_loss: 0.0877 - classification_loss: 0.0121 286/500 [================>.............] 
- ETA: 1:49 - loss: 0.0996 - regression_loss: 0.0875 - classification_loss: 0.0121 287/500 [================>.............] - ETA: 1:48 - loss: 0.0993 - regression_loss: 0.0873 - classification_loss: 0.0120 288/500 [================>.............] - ETA: 1:48 - loss: 0.0996 - regression_loss: 0.0876 - classification_loss: 0.0121 289/500 [================>.............] - ETA: 1:47 - loss: 0.0995 - regression_loss: 0.0874 - classification_loss: 0.0121 290/500 [================>.............] - ETA: 1:47 - loss: 0.0993 - regression_loss: 0.0872 - classification_loss: 0.0120 291/500 [================>.............] - ETA: 1:46 - loss: 0.0994 - regression_loss: 0.0874 - classification_loss: 0.0120 292/500 [================>.............] - ETA: 1:46 - loss: 0.0994 - regression_loss: 0.0873 - classification_loss: 0.0120 293/500 [================>.............] - ETA: 1:45 - loss: 0.0992 - regression_loss: 0.0872 - classification_loss: 0.0120 294/500 [================>.............] - ETA: 1:45 - loss: 0.0990 - regression_loss: 0.0870 - classification_loss: 0.0120 295/500 [================>.............] - ETA: 1:44 - loss: 0.0993 - regression_loss: 0.0872 - classification_loss: 0.0121 296/500 [================>.............] - ETA: 1:44 - loss: 0.0991 - regression_loss: 0.0870 - classification_loss: 0.0120 297/500 [================>.............] - ETA: 1:43 - loss: 0.0990 - regression_loss: 0.0869 - classification_loss: 0.0120 298/500 [================>.............] - ETA: 1:43 - loss: 0.0994 - regression_loss: 0.0874 - classification_loss: 0.0121 299/500 [================>.............] - ETA: 1:42 - loss: 0.0992 - regression_loss: 0.0871 - classification_loss: 0.0120 300/500 [=================>............] - ETA: 1:42 - loss: 0.0992 - regression_loss: 0.0872 - classification_loss: 0.0120 301/500 [=================>............] - ETA: 1:41 - loss: 0.0990 - regression_loss: 0.0870 - classification_loss: 0.0120 302/500 [=================>............] 
- ETA: 1:41 - loss: 0.0988 - regression_loss: 0.0868 - classification_loss: 0.0120 303/500 [=================>............] - ETA: 1:40 - loss: 0.0988 - regression_loss: 0.0869 - classification_loss: 0.0120 304/500 [=================>............] - ETA: 1:40 - loss: 0.0989 - regression_loss: 0.0869 - classification_loss: 0.0120 305/500 [=================>............] - ETA: 1:39 - loss: 0.0987 - regression_loss: 0.0867 - classification_loss: 0.0120 306/500 [=================>............] - ETA: 1:39 - loss: 0.0985 - regression_loss: 0.0866 - classification_loss: 0.0120 307/500 [=================>............] - ETA: 1:38 - loss: 0.0983 - regression_loss: 0.0864 - classification_loss: 0.0119 308/500 [=================>............] - ETA: 1:38 - loss: 0.0982 - regression_loss: 0.0863 - classification_loss: 0.0119 309/500 [=================>............] - ETA: 1:37 - loss: 0.0981 - regression_loss: 0.0862 - classification_loss: 0.0119 310/500 [=================>............] - ETA: 1:37 - loss: 0.0979 - regression_loss: 0.0860 - classification_loss: 0.0119 311/500 [=================>............] - ETA: 1:36 - loss: 0.0978 - regression_loss: 0.0859 - classification_loss: 0.0119 312/500 [=================>............] - ETA: 1:36 - loss: 0.0977 - regression_loss: 0.0858 - classification_loss: 0.0119 313/500 [=================>............] - ETA: 1:35 - loss: 0.0976 - regression_loss: 0.0857 - classification_loss: 0.0119 314/500 [=================>............] - ETA: 1:35 - loss: 0.0986 - regression_loss: 0.0867 - classification_loss: 0.0119 315/500 [=================>............] - ETA: 1:34 - loss: 0.0986 - regression_loss: 0.0867 - classification_loss: 0.0119 316/500 [=================>............] - ETA: 1:34 - loss: 0.0998 - regression_loss: 0.0878 - classification_loss: 0.0119 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1021 - regression_loss: 0.0894 - classification_loss: 0.0127 318/500 [==================>...........] 
- ETA: 1:33 - loss: 0.1018 - regression_loss: 0.0891 - classification_loss: 0.0127
[... per-step progress-bar updates for steps 319-493 of epoch 16 elided; loss drifted from 0.1018 down to 0.0978 ...]
494/500 [============================>.] 
- ETA: 3s - loss: 0.0978 - regression_loss: 0.0857 - classification_loss: 0.0120
[... per-step progress-bar updates for steps 495-499 elided ...]
500/500 [==============================] - 256s 511ms/step - loss: 0.0991 - regression_loss: 0.0866 - classification_loss: 0.0125
326 instances of class plum with average precision: 0.8405
mAP: 0.8405
Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5
Epoch 17/150
1/500 [..............................] - ETA: 4:16 - loss: 0.0450 - regression_loss: 0.0422 - classification_loss: 0.0028
[... per-step progress-bar updates for steps 2-8 elided ...]
9/500 [..............................] 
- ETA: 4:10 - loss: 0.1499 - regression_loss: 0.1337 - classification_loss: 0.0163
[... per-step progress-bar updates for steps 9-152 of epoch 17 elided; loss settled from 0.1499 to 0.0983 ...]
153/500 [========>.....................] 
- ETA: 2:55 - loss: 0.0979 - regression_loss: 0.0857 - classification_loss: 0.0122 154/500 [========>.....................] - ETA: 2:54 - loss: 0.0977 - regression_loss: 0.0855 - classification_loss: 0.0122 155/500 [========>.....................] - ETA: 2:54 - loss: 0.0983 - regression_loss: 0.0861 - classification_loss: 0.0122 156/500 [========>.....................] - ETA: 2:53 - loss: 0.0985 - regression_loss: 0.0862 - classification_loss: 0.0123 157/500 [========>.....................] - ETA: 2:53 - loss: 0.0982 - regression_loss: 0.0860 - classification_loss: 0.0122 158/500 [========>.....................] - ETA: 2:52 - loss: 0.0981 - regression_loss: 0.0859 - classification_loss: 0.0122 159/500 [========>.....................] - ETA: 2:52 - loss: 0.0979 - regression_loss: 0.0858 - classification_loss: 0.0122 160/500 [========>.....................] - ETA: 2:51 - loss: 0.0975 - regression_loss: 0.0854 - classification_loss: 0.0121 161/500 [========>.....................] - ETA: 2:50 - loss: 0.0971 - regression_loss: 0.0851 - classification_loss: 0.0121 162/500 [========>.....................] - ETA: 2:50 - loss: 0.0968 - regression_loss: 0.0848 - classification_loss: 0.0120 163/500 [========>.....................] - ETA: 2:49 - loss: 0.0966 - regression_loss: 0.0846 - classification_loss: 0.0120 164/500 [========>.....................] - ETA: 2:49 - loss: 0.0963 - regression_loss: 0.0844 - classification_loss: 0.0120 165/500 [========>.....................] - ETA: 2:48 - loss: 0.0962 - regression_loss: 0.0842 - classification_loss: 0.0120 166/500 [========>.....................] - ETA: 2:48 - loss: 0.0959 - regression_loss: 0.0840 - classification_loss: 0.0120 167/500 [=========>....................] - ETA: 2:47 - loss: 0.0957 - regression_loss: 0.0837 - classification_loss: 0.0119 168/500 [=========>....................] - ETA: 2:47 - loss: 0.0990 - regression_loss: 0.0868 - classification_loss: 0.0122 169/500 [=========>....................] 
- ETA: 2:46 - loss: 0.0988 - regression_loss: 0.0867 - classification_loss: 0.0122 170/500 [=========>....................] - ETA: 2:46 - loss: 0.0987 - regression_loss: 0.0866 - classification_loss: 0.0121 171/500 [=========>....................] - ETA: 2:46 - loss: 0.0993 - regression_loss: 0.0871 - classification_loss: 0.0122 172/500 [=========>....................] - ETA: 2:45 - loss: 0.0989 - regression_loss: 0.0868 - classification_loss: 0.0121 173/500 [=========>....................] - ETA: 2:45 - loss: 0.0988 - regression_loss: 0.0867 - classification_loss: 0.0121 174/500 [=========>....................] - ETA: 2:44 - loss: 0.0984 - regression_loss: 0.0863 - classification_loss: 0.0121 175/500 [=========>....................] - ETA: 2:44 - loss: 0.0981 - regression_loss: 0.0861 - classification_loss: 0.0120 176/500 [=========>....................] - ETA: 2:43 - loss: 0.0979 - regression_loss: 0.0859 - classification_loss: 0.0120 177/500 [=========>....................] - ETA: 2:43 - loss: 0.0976 - regression_loss: 0.0856 - classification_loss: 0.0120 178/500 [=========>....................] - ETA: 2:42 - loss: 0.0979 - regression_loss: 0.0859 - classification_loss: 0.0120 179/500 [=========>....................] - ETA: 2:42 - loss: 0.0976 - regression_loss: 0.0857 - classification_loss: 0.0119 180/500 [=========>....................] - ETA: 2:41 - loss: 0.0975 - regression_loss: 0.0856 - classification_loss: 0.0119 181/500 [=========>....................] - ETA: 2:41 - loss: 0.0974 - regression_loss: 0.0855 - classification_loss: 0.0119 182/500 [=========>....................] - ETA: 2:40 - loss: 0.0970 - regression_loss: 0.0852 - classification_loss: 0.0118 183/500 [=========>....................] - ETA: 2:40 - loss: 0.0968 - regression_loss: 0.0850 - classification_loss: 0.0118 184/500 [==========>...................] - ETA: 2:39 - loss: 0.0965 - regression_loss: 0.0847 - classification_loss: 0.0118 185/500 [==========>...................] 
- ETA: 2:39 - loss: 0.0962 - regression_loss: 0.0845 - classification_loss: 0.0117 186/500 [==========>...................] - ETA: 2:38 - loss: 0.0962 - regression_loss: 0.0845 - classification_loss: 0.0117 187/500 [==========>...................] - ETA: 2:38 - loss: 0.0959 - regression_loss: 0.0842 - classification_loss: 0.0117 188/500 [==========>...................] - ETA: 2:37 - loss: 0.0957 - regression_loss: 0.0840 - classification_loss: 0.0117 189/500 [==========>...................] - ETA: 2:37 - loss: 0.0953 - regression_loss: 0.0837 - classification_loss: 0.0116 190/500 [==========>...................] - ETA: 2:36 - loss: 0.0956 - regression_loss: 0.0839 - classification_loss: 0.0117 191/500 [==========>...................] - ETA: 2:36 - loss: 0.0954 - regression_loss: 0.0837 - classification_loss: 0.0116 192/500 [==========>...................] - ETA: 2:35 - loss: 0.0953 - regression_loss: 0.0836 - classification_loss: 0.0116 193/500 [==========>...................] - ETA: 2:35 - loss: 0.0952 - regression_loss: 0.0836 - classification_loss: 0.0116 194/500 [==========>...................] - ETA: 2:34 - loss: 0.0950 - regression_loss: 0.0834 - classification_loss: 0.0116 195/500 [==========>...................] - ETA: 2:34 - loss: 0.0970 - regression_loss: 0.0853 - classification_loss: 0.0117 196/500 [==========>...................] - ETA: 2:33 - loss: 0.0968 - regression_loss: 0.0851 - classification_loss: 0.0117 197/500 [==========>...................] - ETA: 2:33 - loss: 0.0970 - regression_loss: 0.0853 - classification_loss: 0.0117 198/500 [==========>...................] - ETA: 2:32 - loss: 0.0967 - regression_loss: 0.0851 - classification_loss: 0.0117 199/500 [==========>...................] - ETA: 2:32 - loss: 0.0967 - regression_loss: 0.0850 - classification_loss: 0.0116 200/500 [===========>..................] - ETA: 2:31 - loss: 0.0964 - regression_loss: 0.0848 - classification_loss: 0.0116 201/500 [===========>..................] 
- ETA: 2:31 - loss: 0.0962 - regression_loss: 0.0846 - classification_loss: 0.0116 202/500 [===========>..................] - ETA: 2:30 - loss: 0.0963 - regression_loss: 0.0847 - classification_loss: 0.0116 203/500 [===========>..................] - ETA: 2:30 - loss: 0.0962 - regression_loss: 0.0846 - classification_loss: 0.0116 204/500 [===========>..................] - ETA: 2:29 - loss: 0.0964 - regression_loss: 0.0849 - classification_loss: 0.0116 205/500 [===========>..................] - ETA: 2:29 - loss: 0.0965 - regression_loss: 0.0849 - classification_loss: 0.0116 206/500 [===========>..................] - ETA: 2:28 - loss: 0.0972 - regression_loss: 0.0854 - classification_loss: 0.0117 207/500 [===========>..................] - ETA: 2:28 - loss: 0.0969 - regression_loss: 0.0852 - classification_loss: 0.0117 208/500 [===========>..................] - ETA: 2:27 - loss: 0.0969 - regression_loss: 0.0852 - classification_loss: 0.0117 209/500 [===========>..................] - ETA: 2:27 - loss: 0.0968 - regression_loss: 0.0851 - classification_loss: 0.0117 210/500 [===========>..................] - ETA: 2:26 - loss: 0.0966 - regression_loss: 0.0849 - classification_loss: 0.0117 211/500 [===========>..................] - ETA: 2:26 - loss: 0.0963 - regression_loss: 0.0847 - classification_loss: 0.0116 212/500 [===========>..................] - ETA: 2:25 - loss: 0.0961 - regression_loss: 0.0844 - classification_loss: 0.0116 213/500 [===========>..................] - ETA: 2:25 - loss: 0.0965 - regression_loss: 0.0848 - classification_loss: 0.0116 214/500 [===========>..................] - ETA: 2:24 - loss: 0.0962 - regression_loss: 0.0846 - classification_loss: 0.0116 215/500 [===========>..................] - ETA: 2:24 - loss: 0.0971 - regression_loss: 0.0854 - classification_loss: 0.0117 216/500 [===========>..................] - ETA: 2:23 - loss: 0.0969 - regression_loss: 0.0853 - classification_loss: 0.0116 217/500 [============>.................] 
- ETA: 2:23 - loss: 0.0967 - regression_loss: 0.0851 - classification_loss: 0.0116 218/500 [============>.................] - ETA: 2:22 - loss: 0.0965 - regression_loss: 0.0849 - classification_loss: 0.0116 219/500 [============>.................] - ETA: 2:22 - loss: 0.0966 - regression_loss: 0.0850 - classification_loss: 0.0116 220/500 [============>.................] - ETA: 2:21 - loss: 0.0962 - regression_loss: 0.0847 - classification_loss: 0.0115 221/500 [============>.................] - ETA: 2:21 - loss: 0.0959 - regression_loss: 0.0844 - classification_loss: 0.0115 222/500 [============>.................] - ETA: 2:20 - loss: 0.0960 - regression_loss: 0.0845 - classification_loss: 0.0116 223/500 [============>.................] - ETA: 2:20 - loss: 0.0960 - regression_loss: 0.0844 - classification_loss: 0.0116 224/500 [============>.................] - ETA: 2:19 - loss: 0.0958 - regression_loss: 0.0843 - classification_loss: 0.0115 225/500 [============>.................] - ETA: 2:19 - loss: 0.0956 - regression_loss: 0.0841 - classification_loss: 0.0115 226/500 [============>.................] - ETA: 2:18 - loss: 0.0954 - regression_loss: 0.0839 - classification_loss: 0.0115 227/500 [============>.................] - ETA: 2:18 - loss: 0.0951 - regression_loss: 0.0837 - classification_loss: 0.0114 228/500 [============>.................] - ETA: 2:17 - loss: 0.0949 - regression_loss: 0.0834 - classification_loss: 0.0114 229/500 [============>.................] - ETA: 2:17 - loss: 0.0947 - regression_loss: 0.0832 - classification_loss: 0.0114 230/500 [============>.................] - ETA: 2:16 - loss: 0.0947 - regression_loss: 0.0833 - classification_loss: 0.0114 231/500 [============>.................] - ETA: 2:16 - loss: 0.0945 - regression_loss: 0.0831 - classification_loss: 0.0114 232/500 [============>.................] - ETA: 2:15 - loss: 0.0943 - regression_loss: 0.0829 - classification_loss: 0.0114 233/500 [============>.................] 
- ETA: 2:15 - loss: 0.0942 - regression_loss: 0.0828 - classification_loss: 0.0114 234/500 [=============>................] - ETA: 2:14 - loss: 0.0941 - regression_loss: 0.0827 - classification_loss: 0.0113 235/500 [=============>................] - ETA: 2:14 - loss: 0.0938 - regression_loss: 0.0825 - classification_loss: 0.0113 236/500 [=============>................] - ETA: 2:13 - loss: 0.0936 - regression_loss: 0.0823 - classification_loss: 0.0113 237/500 [=============>................] - ETA: 2:13 - loss: 0.0934 - regression_loss: 0.0821 - classification_loss: 0.0113 238/500 [=============>................] - ETA: 2:12 - loss: 0.0932 - regression_loss: 0.0820 - classification_loss: 0.0113 239/500 [=============>................] - ETA: 2:12 - loss: 0.0930 - regression_loss: 0.0817 - classification_loss: 0.0112 240/500 [=============>................] - ETA: 2:11 - loss: 0.0930 - regression_loss: 0.0818 - classification_loss: 0.0113 241/500 [=============>................] - ETA: 2:11 - loss: 0.0929 - regression_loss: 0.0817 - classification_loss: 0.0112 242/500 [=============>................] - ETA: 2:10 - loss: 0.0926 - regression_loss: 0.0814 - classification_loss: 0.0112 243/500 [=============>................] - ETA: 2:10 - loss: 0.0925 - regression_loss: 0.0813 - classification_loss: 0.0112 244/500 [=============>................] - ETA: 2:09 - loss: 0.0938 - regression_loss: 0.0826 - classification_loss: 0.0112 245/500 [=============>................] - ETA: 2:09 - loss: 0.0936 - regression_loss: 0.0824 - classification_loss: 0.0112 246/500 [=============>................] - ETA: 2:08 - loss: 0.0935 - regression_loss: 0.0823 - classification_loss: 0.0112 247/500 [=============>................] - ETA: 2:08 - loss: 0.0933 - regression_loss: 0.0821 - classification_loss: 0.0112 248/500 [=============>................] - ETA: 2:07 - loss: 0.0931 - regression_loss: 0.0819 - classification_loss: 0.0111 249/500 [=============>................] 
- ETA: 2:07 - loss: 0.0931 - regression_loss: 0.0820 - classification_loss: 0.0111 250/500 [==============>...............] - ETA: 2:06 - loss: 0.0929 - regression_loss: 0.0818 - classification_loss: 0.0111 251/500 [==============>...............] - ETA: 2:06 - loss: 0.0926 - regression_loss: 0.0816 - classification_loss: 0.0111 252/500 [==============>...............] - ETA: 2:05 - loss: 0.0928 - regression_loss: 0.0817 - classification_loss: 0.0111 253/500 [==============>...............] - ETA: 2:05 - loss: 0.0928 - regression_loss: 0.0817 - classification_loss: 0.0111 254/500 [==============>...............] - ETA: 2:04 - loss: 0.0925 - regression_loss: 0.0814 - classification_loss: 0.0111 255/500 [==============>...............] - ETA: 2:04 - loss: 0.0926 - regression_loss: 0.0816 - classification_loss: 0.0110 256/500 [==============>...............] - ETA: 2:03 - loss: 0.0924 - regression_loss: 0.0814 - classification_loss: 0.0110 257/500 [==============>...............] - ETA: 2:03 - loss: 0.0922 - regression_loss: 0.0812 - classification_loss: 0.0110 258/500 [==============>...............] - ETA: 2:02 - loss: 0.0921 - regression_loss: 0.0811 - classification_loss: 0.0110 259/500 [==============>...............] - ETA: 2:02 - loss: 0.0921 - regression_loss: 0.0811 - classification_loss: 0.0110 260/500 [==============>...............] - ETA: 2:01 - loss: 0.0921 - regression_loss: 0.0811 - classification_loss: 0.0110 261/500 [==============>...............] - ETA: 2:01 - loss: 0.0922 - regression_loss: 0.0812 - classification_loss: 0.0110 262/500 [==============>...............] - ETA: 2:00 - loss: 0.0920 - regression_loss: 0.0810 - classification_loss: 0.0110 263/500 [==============>...............] - ETA: 2:00 - loss: 0.0918 - regression_loss: 0.0809 - classification_loss: 0.0110 264/500 [==============>...............] - ETA: 1:59 - loss: 0.0915 - regression_loss: 0.0806 - classification_loss: 0.0109 265/500 [==============>...............] 
- ETA: 1:59 - loss: 0.0914 - regression_loss: 0.0805 - classification_loss: 0.0109 266/500 [==============>...............] - ETA: 1:58 - loss: 0.0942 - regression_loss: 0.0824 - classification_loss: 0.0118 267/500 [===============>..............] - ETA: 1:58 - loss: 0.0939 - regression_loss: 0.0822 - classification_loss: 0.0118 268/500 [===============>..............] - ETA: 1:57 - loss: 0.0940 - regression_loss: 0.0822 - classification_loss: 0.0118 269/500 [===============>..............] - ETA: 1:57 - loss: 0.0938 - regression_loss: 0.0820 - classification_loss: 0.0118 270/500 [===============>..............] - ETA: 1:56 - loss: 0.0940 - regression_loss: 0.0822 - classification_loss: 0.0118 271/500 [===============>..............] - ETA: 1:56 - loss: 0.0938 - regression_loss: 0.0821 - classification_loss: 0.0117 272/500 [===============>..............] - ETA: 1:55 - loss: 0.0937 - regression_loss: 0.0820 - classification_loss: 0.0117 273/500 [===============>..............] - ETA: 1:55 - loss: 0.0935 - regression_loss: 0.0818 - classification_loss: 0.0117 274/500 [===============>..............] - ETA: 1:54 - loss: 0.0934 - regression_loss: 0.0818 - classification_loss: 0.0117 275/500 [===============>..............] - ETA: 1:54 - loss: 0.0934 - regression_loss: 0.0817 - classification_loss: 0.0117 276/500 [===============>..............] - ETA: 1:53 - loss: 0.0932 - regression_loss: 0.0815 - classification_loss: 0.0116 277/500 [===============>..............] - ETA: 1:53 - loss: 0.0929 - regression_loss: 0.0813 - classification_loss: 0.0116 278/500 [===============>..............] - ETA: 1:52 - loss: 0.0928 - regression_loss: 0.0812 - classification_loss: 0.0116 279/500 [===============>..............] - ETA: 1:52 - loss: 0.0925 - regression_loss: 0.0810 - classification_loss: 0.0116 280/500 [===============>..............] - ETA: 1:51 - loss: 0.0925 - regression_loss: 0.0809 - classification_loss: 0.0116 281/500 [===============>..............] 
- ETA: 1:51 - loss: 0.0922 - regression_loss: 0.0807 - classification_loss: 0.0116 282/500 [===============>..............] - ETA: 1:50 - loss: 0.0921 - regression_loss: 0.0805 - classification_loss: 0.0115 283/500 [===============>..............] - ETA: 1:50 - loss: 0.0919 - regression_loss: 0.0804 - classification_loss: 0.0115 284/500 [================>.............] - ETA: 1:49 - loss: 0.0918 - regression_loss: 0.0803 - classification_loss: 0.0115 285/500 [================>.............] - ETA: 1:49 - loss: 0.0927 - regression_loss: 0.0809 - classification_loss: 0.0118 286/500 [================>.............] - ETA: 1:48 - loss: 0.0925 - regression_loss: 0.0807 - classification_loss: 0.0118 287/500 [================>.............] - ETA: 1:48 - loss: 0.0925 - regression_loss: 0.0807 - classification_loss: 0.0118 288/500 [================>.............] - ETA: 1:47 - loss: 0.0931 - regression_loss: 0.0813 - classification_loss: 0.0118 289/500 [================>.............] - ETA: 1:47 - loss: 0.0929 - regression_loss: 0.0811 - classification_loss: 0.0118 290/500 [================>.............] - ETA: 1:46 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0118 291/500 [================>.............] - ETA: 1:46 - loss: 0.0929 - regression_loss: 0.0811 - classification_loss: 0.0117 292/500 [================>.............] - ETA: 1:45 - loss: 0.0927 - regression_loss: 0.0810 - classification_loss: 0.0117 293/500 [================>.............] - ETA: 1:45 - loss: 0.0946 - regression_loss: 0.0827 - classification_loss: 0.0119 294/500 [================>.............] - ETA: 1:44 - loss: 0.0944 - regression_loss: 0.0825 - classification_loss: 0.0119 295/500 [================>.............] - ETA: 1:43 - loss: 0.0942 - regression_loss: 0.0823 - classification_loss: 0.0119 296/500 [================>.............] - ETA: 1:43 - loss: 0.0940 - regression_loss: 0.0822 - classification_loss: 0.0119 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.0941 - regression_loss: 0.0822 - classification_loss: 0.0119 298/500 [================>.............] - ETA: 1:42 - loss: 0.0939 - regression_loss: 0.0821 - classification_loss: 0.0118 299/500 [================>.............] - ETA: 1:41 - loss: 0.0941 - regression_loss: 0.0822 - classification_loss: 0.0118 300/500 [=================>............] - ETA: 1:41 - loss: 0.0938 - regression_loss: 0.0820 - classification_loss: 0.0118 301/500 [=================>............] - ETA: 1:40 - loss: 0.0937 - regression_loss: 0.0819 - classification_loss: 0.0118 302/500 [=================>............] - ETA: 1:40 - loss: 0.0936 - regression_loss: 0.0818 - classification_loss: 0.0118 303/500 [=================>............] - ETA: 1:39 - loss: 0.0935 - regression_loss: 0.0817 - classification_loss: 0.0118 304/500 [=================>............] - ETA: 1:39 - loss: 0.0933 - regression_loss: 0.0815 - classification_loss: 0.0118 305/500 [=================>............] - ETA: 1:38 - loss: 0.0931 - regression_loss: 0.0814 - classification_loss: 0.0117 306/500 [=================>............] - ETA: 1:38 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0117 307/500 [=================>............] - ETA: 1:37 - loss: 0.0928 - regression_loss: 0.0812 - classification_loss: 0.0117 308/500 [=================>............] - ETA: 1:37 - loss: 0.0928 - regression_loss: 0.0812 - classification_loss: 0.0117 309/500 [=================>............] - ETA: 1:36 - loss: 0.0931 - regression_loss: 0.0814 - classification_loss: 0.0117 310/500 [=================>............] - ETA: 1:36 - loss: 0.0931 - regression_loss: 0.0814 - classification_loss: 0.0117 311/500 [=================>............] - ETA: 1:35 - loss: 0.0929 - regression_loss: 0.0812 - classification_loss: 0.0117 312/500 [=================>............] - ETA: 1:35 - loss: 0.0927 - regression_loss: 0.0811 - classification_loss: 0.0116 313/500 [=================>............] 
- ETA: 1:34 - loss: 0.0926 - regression_loss: 0.0809 - classification_loss: 0.0116 314/500 [=================>............] - ETA: 1:34 - loss: 0.0925 - regression_loss: 0.0809 - classification_loss: 0.0116 315/500 [=================>............] - ETA: 1:33 - loss: 0.0924 - regression_loss: 0.0808 - classification_loss: 0.0116 316/500 [=================>............] - ETA: 1:33 - loss: 0.0922 - regression_loss: 0.0806 - classification_loss: 0.0116 317/500 [==================>...........] - ETA: 1:32 - loss: 0.0923 - regression_loss: 0.0807 - classification_loss: 0.0116 318/500 [==================>...........] - ETA: 1:32 - loss: 0.0924 - regression_loss: 0.0808 - classification_loss: 0.0116 319/500 [==================>...........] - ETA: 1:31 - loss: 0.0923 - regression_loss: 0.0807 - classification_loss: 0.0116 320/500 [==================>...........] - ETA: 1:31 - loss: 0.0921 - regression_loss: 0.0805 - classification_loss: 0.0115 321/500 [==================>...........] - ETA: 1:30 - loss: 0.0919 - regression_loss: 0.0804 - classification_loss: 0.0115 322/500 [==================>...........] - ETA: 1:30 - loss: 0.0921 - regression_loss: 0.0805 - classification_loss: 0.0115 323/500 [==================>...........] - ETA: 1:29 - loss: 0.0920 - regression_loss: 0.0804 - classification_loss: 0.0115 324/500 [==================>...........] - ETA: 1:29 - loss: 0.0918 - regression_loss: 0.0803 - classification_loss: 0.0115 325/500 [==================>...........] - ETA: 1:28 - loss: 0.0916 - regression_loss: 0.0801 - classification_loss: 0.0115 326/500 [==================>...........] - ETA: 1:28 - loss: 0.0915 - regression_loss: 0.0800 - classification_loss: 0.0115 327/500 [==================>...........] - ETA: 1:27 - loss: 0.0914 - regression_loss: 0.0799 - classification_loss: 0.0114 328/500 [==================>...........] - ETA: 1:27 - loss: 0.0914 - regression_loss: 0.0800 - classification_loss: 0.0114 329/500 [==================>...........] 
- ETA: 1:26 - loss: 0.0913 - regression_loss: 0.0799 - classification_loss: 0.0114 330/500 [==================>...........] - ETA: 1:26 - loss: 0.0912 - regression_loss: 0.0798 - classification_loss: 0.0114 331/500 [==================>...........] - ETA: 1:25 - loss: 0.0911 - regression_loss: 0.0797 - classification_loss: 0.0114 332/500 [==================>...........] - ETA: 1:25 - loss: 0.0910 - regression_loss: 0.0796 - classification_loss: 0.0114 333/500 [==================>...........] - ETA: 1:24 - loss: 0.0911 - regression_loss: 0.0797 - classification_loss: 0.0114 334/500 [===================>..........] - ETA: 1:24 - loss: 0.0909 - regression_loss: 0.0796 - classification_loss: 0.0114 335/500 [===================>..........] - ETA: 1:23 - loss: 0.0908 - regression_loss: 0.0794 - classification_loss: 0.0113 336/500 [===================>..........] - ETA: 1:23 - loss: 0.0907 - regression_loss: 0.0794 - classification_loss: 0.0113 337/500 [===================>..........] - ETA: 1:22 - loss: 0.0906 - regression_loss: 0.0793 - classification_loss: 0.0113 338/500 [===================>..........] - ETA: 1:22 - loss: 0.0905 - regression_loss: 0.0792 - classification_loss: 0.0113 339/500 [===================>..........] - ETA: 1:21 - loss: 0.0904 - regression_loss: 0.0791 - classification_loss: 0.0113 340/500 [===================>..........] - ETA: 1:21 - loss: 0.0904 - regression_loss: 0.0791 - classification_loss: 0.0113 341/500 [===================>..........] - ETA: 1:20 - loss: 0.0903 - regression_loss: 0.0790 - classification_loss: 0.0112 342/500 [===================>..........] - ETA: 1:20 - loss: 0.0902 - regression_loss: 0.0790 - classification_loss: 0.0112 343/500 [===================>..........] - ETA: 1:19 - loss: 0.0900 - regression_loss: 0.0788 - classification_loss: 0.0112 344/500 [===================>..........] - ETA: 1:19 - loss: 0.0902 - regression_loss: 0.0790 - classification_loss: 0.0112 345/500 [===================>..........] 
- ETA: 1:18 - loss: 0.0904 - regression_loss: 0.0791 - classification_loss: 0.0113 346/500 [===================>..........] - ETA: 1:18 - loss: 0.0904 - regression_loss: 0.0791 - classification_loss: 0.0113 347/500 [===================>..........] - ETA: 1:17 - loss: 0.0903 - regression_loss: 0.0790 - classification_loss: 0.0112 348/500 [===================>..........] - ETA: 1:17 - loss: 0.0902 - regression_loss: 0.0790 - classification_loss: 0.0112 349/500 [===================>..........] - ETA: 1:16 - loss: 0.0901 - regression_loss: 0.0789 - classification_loss: 0.0112 350/500 [====================>.........] - ETA: 1:16 - loss: 0.0905 - regression_loss: 0.0792 - classification_loss: 0.0112 351/500 [====================>.........] - ETA: 1:15 - loss: 0.0904 - regression_loss: 0.0791 - classification_loss: 0.0112 352/500 [====================>.........] - ETA: 1:15 - loss: 0.0903 - regression_loss: 0.0791 - classification_loss: 0.0112 353/500 [====================>.........] - ETA: 1:14 - loss: 0.0902 - regression_loss: 0.0790 - classification_loss: 0.0112 354/500 [====================>.........] - ETA: 1:14 - loss: 0.0907 - regression_loss: 0.0794 - classification_loss: 0.0113 355/500 [====================>.........] - ETA: 1:13 - loss: 0.0908 - regression_loss: 0.0796 - classification_loss: 0.0113 356/500 [====================>.........] - ETA: 1:13 - loss: 0.0924 - regression_loss: 0.0810 - classification_loss: 0.0114 357/500 [====================>.........] - ETA: 1:12 - loss: 0.0946 - regression_loss: 0.0825 - classification_loss: 0.0121 358/500 [====================>.........] - ETA: 1:12 - loss: 0.0945 - regression_loss: 0.0825 - classification_loss: 0.0121 359/500 [====================>.........] - ETA: 1:11 - loss: 0.0944 - regression_loss: 0.0823 - classification_loss: 0.0121 360/500 [====================>.........] - ETA: 1:11 - loss: 0.0942 - regression_loss: 0.0822 - classification_loss: 0.0120 361/500 [====================>.........] 
[Epoch 17: per-step progress updates truncated]
500/500 [==============================] - 255s 509ms/step - loss: 0.0919 - regression_loss: 0.0803 - classification_loss: 0.0116
326 instances of class plum with average precision: 0.8402
mAP: 0.8402
Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5
Epoch 18/150
[Epoch 18: per-step progress updates truncated]
- ETA: 2:34 - loss: 0.0890 - regression_loss: 0.0770 - classification_loss: 0.0120 197/500 [==========>...................] - ETA: 2:34 - loss: 0.0888 - regression_loss: 0.0768 - classification_loss: 0.0120 198/500 [==========>...................] - ETA: 2:33 - loss: 0.0889 - regression_loss: 0.0770 - classification_loss: 0.0119 199/500 [==========>...................] - ETA: 2:33 - loss: 0.0889 - regression_loss: 0.0769 - classification_loss: 0.0119 200/500 [===========>..................] - ETA: 2:32 - loss: 0.0886 - regression_loss: 0.0767 - classification_loss: 0.0119 201/500 [===========>..................] - ETA: 2:32 - loss: 0.0891 - regression_loss: 0.0771 - classification_loss: 0.0120 202/500 [===========>..................] - ETA: 2:31 - loss: 0.0889 - regression_loss: 0.0770 - classification_loss: 0.0120 203/500 [===========>..................] - ETA: 2:31 - loss: 0.0902 - regression_loss: 0.0779 - classification_loss: 0.0123 204/500 [===========>..................] - ETA: 2:30 - loss: 0.0899 - regression_loss: 0.0776 - classification_loss: 0.0123 205/500 [===========>..................] - ETA: 2:30 - loss: 0.0899 - regression_loss: 0.0776 - classification_loss: 0.0123 206/500 [===========>..................] - ETA: 2:29 - loss: 0.0901 - regression_loss: 0.0778 - classification_loss: 0.0123 207/500 [===========>..................] - ETA: 2:29 - loss: 0.0903 - regression_loss: 0.0781 - classification_loss: 0.0123 208/500 [===========>..................] - ETA: 2:28 - loss: 0.0902 - regression_loss: 0.0780 - classification_loss: 0.0122 209/500 [===========>..................] - ETA: 2:28 - loss: 0.0898 - regression_loss: 0.0777 - classification_loss: 0.0122 210/500 [===========>..................] - ETA: 2:27 - loss: 0.0896 - regression_loss: 0.0774 - classification_loss: 0.0121 211/500 [===========>..................] - ETA: 2:27 - loss: 0.0900 - regression_loss: 0.0779 - classification_loss: 0.0122 212/500 [===========>..................] 
- ETA: 2:26 - loss: 0.0898 - regression_loss: 0.0776 - classification_loss: 0.0121 213/500 [===========>..................] - ETA: 2:26 - loss: 0.0902 - regression_loss: 0.0781 - classification_loss: 0.0122 214/500 [===========>..................] - ETA: 2:25 - loss: 0.0899 - regression_loss: 0.0778 - classification_loss: 0.0121 215/500 [===========>..................] - ETA: 2:25 - loss: 0.0898 - regression_loss: 0.0777 - classification_loss: 0.0121 216/500 [===========>..................] - ETA: 2:24 - loss: 0.0898 - regression_loss: 0.0777 - classification_loss: 0.0121 217/500 [============>.................] - ETA: 2:24 - loss: 0.0896 - regression_loss: 0.0776 - classification_loss: 0.0120 218/500 [============>.................] - ETA: 2:23 - loss: 0.0894 - regression_loss: 0.0774 - classification_loss: 0.0120 219/500 [============>.................] - ETA: 2:23 - loss: 0.0893 - regression_loss: 0.0773 - classification_loss: 0.0120 220/500 [============>.................] - ETA: 2:22 - loss: 0.0891 - regression_loss: 0.0771 - classification_loss: 0.0120 221/500 [============>.................] - ETA: 2:22 - loss: 0.0891 - regression_loss: 0.0771 - classification_loss: 0.0119 222/500 [============>.................] - ETA: 2:21 - loss: 0.0888 - regression_loss: 0.0769 - classification_loss: 0.0119 223/500 [============>.................] - ETA: 2:21 - loss: 0.0895 - regression_loss: 0.0775 - classification_loss: 0.0120 224/500 [============>.................] - ETA: 2:20 - loss: 0.0893 - regression_loss: 0.0774 - classification_loss: 0.0119 225/500 [============>.................] - ETA: 2:20 - loss: 0.0917 - regression_loss: 0.0796 - classification_loss: 0.0122 226/500 [============>.................] - ETA: 2:19 - loss: 0.0918 - regression_loss: 0.0797 - classification_loss: 0.0122 227/500 [============>.................] - ETA: 2:19 - loss: 0.0917 - regression_loss: 0.0795 - classification_loss: 0.0122 228/500 [============>.................] 
- ETA: 2:18 - loss: 0.0915 - regression_loss: 0.0794 - classification_loss: 0.0121 229/500 [============>.................] - ETA: 2:18 - loss: 0.0916 - regression_loss: 0.0795 - classification_loss: 0.0121 230/500 [============>.................] - ETA: 2:17 - loss: 0.0913 - regression_loss: 0.0792 - classification_loss: 0.0121 231/500 [============>.................] - ETA: 2:17 - loss: 0.0912 - regression_loss: 0.0792 - classification_loss: 0.0121 232/500 [============>.................] - ETA: 2:16 - loss: 0.0926 - regression_loss: 0.0805 - classification_loss: 0.0121 233/500 [============>.................] - ETA: 2:16 - loss: 0.0941 - regression_loss: 0.0820 - classification_loss: 0.0122 234/500 [=============>................] - ETA: 2:15 - loss: 0.0940 - regression_loss: 0.0819 - classification_loss: 0.0122 235/500 [=============>................] - ETA: 2:15 - loss: 0.0940 - regression_loss: 0.0819 - classification_loss: 0.0122 236/500 [=============>................] - ETA: 2:14 - loss: 0.0938 - regression_loss: 0.0817 - classification_loss: 0.0121 237/500 [=============>................] - ETA: 2:14 - loss: 0.0936 - regression_loss: 0.0815 - classification_loss: 0.0121 238/500 [=============>................] - ETA: 2:13 - loss: 0.0933 - regression_loss: 0.0813 - classification_loss: 0.0121 239/500 [=============>................] - ETA: 2:13 - loss: 0.0938 - regression_loss: 0.0816 - classification_loss: 0.0122 240/500 [=============>................] - ETA: 2:12 - loss: 0.0939 - regression_loss: 0.0817 - classification_loss: 0.0122 241/500 [=============>................] - ETA: 2:12 - loss: 0.0937 - regression_loss: 0.0815 - classification_loss: 0.0122 242/500 [=============>................] - ETA: 2:11 - loss: 0.0936 - regression_loss: 0.0814 - classification_loss: 0.0122 243/500 [=============>................] - ETA: 2:10 - loss: 0.0935 - regression_loss: 0.0814 - classification_loss: 0.0121 244/500 [=============>................] 
- ETA: 2:10 - loss: 0.0934 - regression_loss: 0.0813 - classification_loss: 0.0121 245/500 [=============>................] - ETA: 2:09 - loss: 0.0931 - regression_loss: 0.0810 - classification_loss: 0.0121 246/500 [=============>................] - ETA: 2:09 - loss: 0.0929 - regression_loss: 0.0809 - classification_loss: 0.0121 247/500 [=============>................] - ETA: 2:08 - loss: 0.0927 - regression_loss: 0.0807 - classification_loss: 0.0120 248/500 [=============>................] - ETA: 2:08 - loss: 0.0925 - regression_loss: 0.0805 - classification_loss: 0.0120 249/500 [=============>................] - ETA: 2:07 - loss: 0.0924 - regression_loss: 0.0805 - classification_loss: 0.0120 250/500 [==============>...............] - ETA: 2:07 - loss: 0.0922 - regression_loss: 0.0802 - classification_loss: 0.0119 251/500 [==============>...............] - ETA: 2:06 - loss: 0.0919 - regression_loss: 0.0800 - classification_loss: 0.0119 252/500 [==============>...............] - ETA: 2:06 - loss: 0.0919 - regression_loss: 0.0800 - classification_loss: 0.0119 253/500 [==============>...............] - ETA: 2:05 - loss: 0.0919 - regression_loss: 0.0800 - classification_loss: 0.0119 254/500 [==============>...............] - ETA: 2:05 - loss: 0.0924 - regression_loss: 0.0805 - classification_loss: 0.0119 255/500 [==============>...............] - ETA: 2:04 - loss: 0.0921 - regression_loss: 0.0802 - classification_loss: 0.0119 256/500 [==============>...............] - ETA: 2:04 - loss: 0.0922 - regression_loss: 0.0803 - classification_loss: 0.0119 257/500 [==============>...............] - ETA: 2:03 - loss: 0.0920 - regression_loss: 0.0802 - classification_loss: 0.0119 258/500 [==============>...............] - ETA: 2:03 - loss: 0.0919 - regression_loss: 0.0800 - classification_loss: 0.0119 259/500 [==============>...............] - ETA: 2:02 - loss: 0.0917 - regression_loss: 0.0799 - classification_loss: 0.0119 260/500 [==============>...............] 
- ETA: 2:02 - loss: 0.0919 - regression_loss: 0.0800 - classification_loss: 0.0119 261/500 [==============>...............] - ETA: 2:01 - loss: 0.0917 - regression_loss: 0.0799 - classification_loss: 0.0119 262/500 [==============>...............] - ETA: 2:01 - loss: 0.0915 - regression_loss: 0.0797 - classification_loss: 0.0118 263/500 [==============>...............] - ETA: 2:00 - loss: 0.0916 - regression_loss: 0.0797 - classification_loss: 0.0118 264/500 [==============>...............] - ETA: 2:00 - loss: 0.0915 - regression_loss: 0.0796 - classification_loss: 0.0118 265/500 [==============>...............] - ETA: 1:59 - loss: 0.0917 - regression_loss: 0.0799 - classification_loss: 0.0118 266/500 [==============>...............] - ETA: 1:59 - loss: 0.0916 - regression_loss: 0.0798 - classification_loss: 0.0118 267/500 [===============>..............] - ETA: 1:58 - loss: 0.0915 - regression_loss: 0.0797 - classification_loss: 0.0118 268/500 [===============>..............] - ETA: 1:58 - loss: 0.0913 - regression_loss: 0.0795 - classification_loss: 0.0118 269/500 [===============>..............] - ETA: 1:57 - loss: 0.0912 - regression_loss: 0.0794 - classification_loss: 0.0118 270/500 [===============>..............] - ETA: 1:57 - loss: 0.0910 - regression_loss: 0.0792 - classification_loss: 0.0117 271/500 [===============>..............] - ETA: 1:56 - loss: 0.0911 - regression_loss: 0.0793 - classification_loss: 0.0117 272/500 [===============>..............] - ETA: 1:56 - loss: 0.0911 - regression_loss: 0.0794 - classification_loss: 0.0117 273/500 [===============>..............] - ETA: 1:55 - loss: 0.0913 - regression_loss: 0.0796 - classification_loss: 0.0117 274/500 [===============>..............] - ETA: 1:55 - loss: 0.0914 - regression_loss: 0.0796 - classification_loss: 0.0118 275/500 [===============>..............] - ETA: 1:54 - loss: 0.0919 - regression_loss: 0.0801 - classification_loss: 0.0118 276/500 [===============>..............] 
- ETA: 1:54 - loss: 0.0917 - regression_loss: 0.0799 - classification_loss: 0.0118 277/500 [===============>..............] - ETA: 1:53 - loss: 0.0915 - regression_loss: 0.0798 - classification_loss: 0.0118 278/500 [===============>..............] - ETA: 1:53 - loss: 0.0913 - regression_loss: 0.0796 - classification_loss: 0.0118 279/500 [===============>..............] - ETA: 1:52 - loss: 0.0913 - regression_loss: 0.0795 - classification_loss: 0.0117 280/500 [===============>..............] - ETA: 1:52 - loss: 0.0911 - regression_loss: 0.0794 - classification_loss: 0.0117 281/500 [===============>..............] - ETA: 1:51 - loss: 0.0913 - regression_loss: 0.0796 - classification_loss: 0.0117 282/500 [===============>..............] - ETA: 1:51 - loss: 0.0911 - regression_loss: 0.0794 - classification_loss: 0.0117 283/500 [===============>..............] - ETA: 1:50 - loss: 0.0930 - regression_loss: 0.0812 - classification_loss: 0.0119 284/500 [================>.............] - ETA: 1:50 - loss: 0.0928 - regression_loss: 0.0810 - classification_loss: 0.0118 285/500 [================>.............] - ETA: 1:49 - loss: 0.0927 - regression_loss: 0.0808 - classification_loss: 0.0118 286/500 [================>.............] - ETA: 1:49 - loss: 0.0924 - regression_loss: 0.0806 - classification_loss: 0.0118 287/500 [================>.............] - ETA: 1:48 - loss: 0.0930 - regression_loss: 0.0812 - classification_loss: 0.0118 288/500 [================>.............] - ETA: 1:48 - loss: 0.0928 - regression_loss: 0.0810 - classification_loss: 0.0118 289/500 [================>.............] - ETA: 1:47 - loss: 0.0926 - regression_loss: 0.0808 - classification_loss: 0.0118 290/500 [================>.............] - ETA: 1:47 - loss: 0.0926 - regression_loss: 0.0808 - classification_loss: 0.0118 291/500 [================>.............] - ETA: 1:46 - loss: 0.0926 - regression_loss: 0.0808 - classification_loss: 0.0118 292/500 [================>.............] 
- ETA: 1:46 - loss: 0.0925 - regression_loss: 0.0807 - classification_loss: 0.0118 293/500 [================>.............] - ETA: 1:45 - loss: 0.0923 - regression_loss: 0.0806 - classification_loss: 0.0117 294/500 [================>.............] - ETA: 1:45 - loss: 0.0922 - regression_loss: 0.0804 - classification_loss: 0.0117 295/500 [================>.............] - ETA: 1:44 - loss: 0.0923 - regression_loss: 0.0806 - classification_loss: 0.0117 296/500 [================>.............] - ETA: 1:44 - loss: 0.0921 - regression_loss: 0.0804 - classification_loss: 0.0117 297/500 [================>.............] - ETA: 1:43 - loss: 0.0920 - regression_loss: 0.0803 - classification_loss: 0.0117 298/500 [================>.............] - ETA: 1:43 - loss: 0.0918 - regression_loss: 0.0802 - classification_loss: 0.0117 299/500 [================>.............] - ETA: 1:42 - loss: 0.0917 - regression_loss: 0.0800 - classification_loss: 0.0117 300/500 [=================>............] - ETA: 1:42 - loss: 0.0915 - regression_loss: 0.0798 - classification_loss: 0.0117 301/500 [=================>............] - ETA: 1:41 - loss: 0.0913 - regression_loss: 0.0797 - classification_loss: 0.0117 302/500 [=================>............] - ETA: 1:41 - loss: 0.0911 - regression_loss: 0.0795 - classification_loss: 0.0116 303/500 [=================>............] - ETA: 1:40 - loss: 0.0912 - regression_loss: 0.0796 - classification_loss: 0.0116 304/500 [=================>............] - ETA: 1:39 - loss: 0.0910 - regression_loss: 0.0794 - classification_loss: 0.0116 305/500 [=================>............] - ETA: 1:39 - loss: 0.0912 - regression_loss: 0.0796 - classification_loss: 0.0116 306/500 [=================>............] - ETA: 1:38 - loss: 0.0911 - regression_loss: 0.0795 - classification_loss: 0.0116 307/500 [=================>............] - ETA: 1:38 - loss: 0.0909 - regression_loss: 0.0794 - classification_loss: 0.0116 308/500 [=================>............] 
- ETA: 1:37 - loss: 0.0908 - regression_loss: 0.0792 - classification_loss: 0.0115 309/500 [=================>............] - ETA: 1:37 - loss: 0.0906 - regression_loss: 0.0791 - classification_loss: 0.0115 310/500 [=================>............] - ETA: 1:36 - loss: 0.0905 - regression_loss: 0.0790 - classification_loss: 0.0115 311/500 [=================>............] - ETA: 1:36 - loss: 0.0903 - regression_loss: 0.0788 - classification_loss: 0.0115 312/500 [=================>............] - ETA: 1:35 - loss: 0.0922 - regression_loss: 0.0805 - classification_loss: 0.0117 313/500 [=================>............] - ETA: 1:35 - loss: 0.0921 - regression_loss: 0.0805 - classification_loss: 0.0117 314/500 [=================>............] - ETA: 1:34 - loss: 0.0920 - regression_loss: 0.0803 - classification_loss: 0.0116 315/500 [=================>............] - ETA: 1:34 - loss: 0.0922 - regression_loss: 0.0805 - classification_loss: 0.0116 316/500 [=================>............] - ETA: 1:33 - loss: 0.0921 - regression_loss: 0.0804 - classification_loss: 0.0116 317/500 [==================>...........] - ETA: 1:33 - loss: 0.0919 - regression_loss: 0.0803 - classification_loss: 0.0116 318/500 [==================>...........] - ETA: 1:32 - loss: 0.0936 - regression_loss: 0.0819 - classification_loss: 0.0117 319/500 [==================>...........] - ETA: 1:32 - loss: 0.0935 - regression_loss: 0.0818 - classification_loss: 0.0117 320/500 [==================>...........] - ETA: 1:31 - loss: 0.0936 - regression_loss: 0.0818 - classification_loss: 0.0117 321/500 [==================>...........] - ETA: 1:31 - loss: 0.0934 - regression_loss: 0.0816 - classification_loss: 0.0117 322/500 [==================>...........] - ETA: 1:30 - loss: 0.0931 - regression_loss: 0.0814 - classification_loss: 0.0117 323/500 [==================>...........] - ETA: 1:30 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0117 324/500 [==================>...........] 
- ETA: 1:29 - loss: 0.0928 - regression_loss: 0.0812 - classification_loss: 0.0117 325/500 [==================>...........] - ETA: 1:29 - loss: 0.0926 - regression_loss: 0.0810 - classification_loss: 0.0116 326/500 [==================>...........] - ETA: 1:28 - loss: 0.0927 - regression_loss: 0.0810 - classification_loss: 0.0116 327/500 [==================>...........] - ETA: 1:28 - loss: 0.0928 - regression_loss: 0.0812 - classification_loss: 0.0117 328/500 [==================>...........] - ETA: 1:27 - loss: 0.0931 - regression_loss: 0.0814 - classification_loss: 0.0117 329/500 [==================>...........] - ETA: 1:27 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0116 330/500 [==================>...........] - ETA: 1:26 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0116 331/500 [==================>...........] - ETA: 1:26 - loss: 0.0928 - regression_loss: 0.0812 - classification_loss: 0.0116 332/500 [==================>...........] - ETA: 1:25 - loss: 0.0927 - regression_loss: 0.0811 - classification_loss: 0.0116 333/500 [==================>...........] - ETA: 1:25 - loss: 0.0926 - regression_loss: 0.0810 - classification_loss: 0.0116 334/500 [===================>..........] - ETA: 1:24 - loss: 0.0929 - regression_loss: 0.0812 - classification_loss: 0.0117 335/500 [===================>..........] - ETA: 1:24 - loss: 0.0932 - regression_loss: 0.0815 - classification_loss: 0.0117 336/500 [===================>..........] - ETA: 1:23 - loss: 0.0933 - regression_loss: 0.0815 - classification_loss: 0.0117 337/500 [===================>..........] - ETA: 1:23 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0117 338/500 [===================>..........] - ETA: 1:22 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0117 339/500 [===================>..........] - ETA: 1:22 - loss: 0.0929 - regression_loss: 0.0812 - classification_loss: 0.0117 340/500 [===================>..........] 
- ETA: 1:21 - loss: 0.0930 - regression_loss: 0.0813 - classification_loss: 0.0117 341/500 [===================>..........] - ETA: 1:21 - loss: 0.0929 - regression_loss: 0.0812 - classification_loss: 0.0117 342/500 [===================>..........] - ETA: 1:20 - loss: 0.0927 - regression_loss: 0.0811 - classification_loss: 0.0117 343/500 [===================>..........] - ETA: 1:20 - loss: 0.0926 - regression_loss: 0.0810 - classification_loss: 0.0116 344/500 [===================>..........] - ETA: 1:19 - loss: 0.0925 - regression_loss: 0.0809 - classification_loss: 0.0116 345/500 [===================>..........] - ETA: 1:19 - loss: 0.0927 - regression_loss: 0.0810 - classification_loss: 0.0116 346/500 [===================>..........] - ETA: 1:18 - loss: 0.0926 - regression_loss: 0.0810 - classification_loss: 0.0116 347/500 [===================>..........] - ETA: 1:18 - loss: 0.0925 - regression_loss: 0.0809 - classification_loss: 0.0116 348/500 [===================>..........] - ETA: 1:17 - loss: 0.0927 - regression_loss: 0.0810 - classification_loss: 0.0116 349/500 [===================>..........] - ETA: 1:17 - loss: 0.0925 - regression_loss: 0.0809 - classification_loss: 0.0116 350/500 [====================>.........] - ETA: 1:16 - loss: 0.0924 - regression_loss: 0.0808 - classification_loss: 0.0116 351/500 [====================>.........] - ETA: 1:15 - loss: 0.0922 - regression_loss: 0.0806 - classification_loss: 0.0116 352/500 [====================>.........] - ETA: 1:15 - loss: 0.0920 - regression_loss: 0.0805 - classification_loss: 0.0116 353/500 [====================>.........] - ETA: 1:14 - loss: 0.0919 - regression_loss: 0.0803 - classification_loss: 0.0115 354/500 [====================>.........] - ETA: 1:14 - loss: 0.0922 - regression_loss: 0.0806 - classification_loss: 0.0115 355/500 [====================>.........] - ETA: 1:13 - loss: 0.0920 - regression_loss: 0.0805 - classification_loss: 0.0115 356/500 [====================>.........] 
- ETA: 1:13 - loss: 0.0919 - regression_loss: 0.0804 - classification_loss: 0.0115 357/500 [====================>.........] - ETA: 1:12 - loss: 0.0918 - regression_loss: 0.0803 - classification_loss: 0.0115 358/500 [====================>.........] - ETA: 1:12 - loss: 0.0917 - regression_loss: 0.0802 - classification_loss: 0.0115 359/500 [====================>.........] - ETA: 1:11 - loss: 0.0915 - regression_loss: 0.0800 - classification_loss: 0.0115 360/500 [====================>.........] - ETA: 1:11 - loss: 0.0915 - regression_loss: 0.0801 - classification_loss: 0.0114 361/500 [====================>.........] - ETA: 1:10 - loss: 0.0914 - regression_loss: 0.0800 - classification_loss: 0.0114 362/500 [====================>.........] - ETA: 1:10 - loss: 0.0912 - regression_loss: 0.0798 - classification_loss: 0.0114 363/500 [====================>.........] - ETA: 1:09 - loss: 0.0910 - regression_loss: 0.0796 - classification_loss: 0.0114 364/500 [====================>.........] - ETA: 1:09 - loss: 0.0909 - regression_loss: 0.0795 - classification_loss: 0.0114 365/500 [====================>.........] - ETA: 1:08 - loss: 0.0909 - regression_loss: 0.0795 - classification_loss: 0.0114 366/500 [====================>.........] - ETA: 1:08 - loss: 0.0907 - regression_loss: 0.0794 - classification_loss: 0.0114 367/500 [=====================>........] - ETA: 1:07 - loss: 0.0905 - regression_loss: 0.0792 - classification_loss: 0.0113 368/500 [=====================>........] - ETA: 1:07 - loss: 0.0904 - regression_loss: 0.0790 - classification_loss: 0.0113 369/500 [=====================>........] - ETA: 1:06 - loss: 0.0902 - regression_loss: 0.0789 - classification_loss: 0.0113 370/500 [=====================>........] - ETA: 1:06 - loss: 0.0902 - regression_loss: 0.0789 - classification_loss: 0.0113 371/500 [=====================>........] - ETA: 1:05 - loss: 0.0900 - regression_loss: 0.0788 - classification_loss: 0.0113 372/500 [=====================>........] 
- ETA: 1:05 - loss: 0.0905 - regression_loss: 0.0792 - classification_loss: 0.0113 373/500 [=====================>........] - ETA: 1:04 - loss: 0.0903 - regression_loss: 0.0790 - classification_loss: 0.0113 374/500 [=====================>........] - ETA: 1:04 - loss: 0.0903 - regression_loss: 0.0791 - classification_loss: 0.0113 375/500 [=====================>........] - ETA: 1:03 - loss: 0.0902 - regression_loss: 0.0790 - classification_loss: 0.0113 376/500 [=====================>........] - ETA: 1:03 - loss: 0.0901 - regression_loss: 0.0789 - classification_loss: 0.0113 377/500 [=====================>........] - ETA: 1:02 - loss: 0.0900 - regression_loss: 0.0788 - classification_loss: 0.0112 378/500 [=====================>........] - ETA: 1:02 - loss: 0.0899 - regression_loss: 0.0787 - classification_loss: 0.0112 379/500 [=====================>........] - ETA: 1:01 - loss: 0.0898 - regression_loss: 0.0786 - classification_loss: 0.0112 380/500 [=====================>........] - ETA: 1:01 - loss: 0.0902 - regression_loss: 0.0790 - classification_loss: 0.0112 381/500 [=====================>........] - ETA: 1:00 - loss: 0.0901 - regression_loss: 0.0789 - classification_loss: 0.0112 382/500 [=====================>........] - ETA: 1:00 - loss: 0.0908 - regression_loss: 0.0794 - classification_loss: 0.0114 383/500 [=====================>........] - ETA: 59s - loss: 0.0907 - regression_loss: 0.0793 - classification_loss: 0.0114  384/500 [======================>.......] - ETA: 59s - loss: 0.0906 - regression_loss: 0.0792 - classification_loss: 0.0114 385/500 [======================>.......] - ETA: 58s - loss: 0.0904 - regression_loss: 0.0791 - classification_loss: 0.0114 386/500 [======================>.......] - ETA: 58s - loss: 0.0902 - regression_loss: 0.0789 - classification_loss: 0.0113 387/500 [======================>.......] - ETA: 57s - loss: 0.0901 - regression_loss: 0.0788 - classification_loss: 0.0113 388/500 [======================>.......] 
- ETA: 57s - loss: 0.0901 - regression_loss: 0.0788 - classification_loss: 0.0113 389/500 [======================>.......] - ETA: 56s - loss: 0.0901 - regression_loss: 0.0788 - classification_loss: 0.0113 390/500 [======================>.......] - ETA: 56s - loss: 0.0902 - regression_loss: 0.0789 - classification_loss: 0.0113 391/500 [======================>.......] - ETA: 55s - loss: 0.0903 - regression_loss: 0.0790 - classification_loss: 0.0113 392/500 [======================>.......] - ETA: 55s - loss: 0.0903 - regression_loss: 0.0790 - classification_loss: 0.0113 393/500 [======================>.......] - ETA: 54s - loss: 0.0901 - regression_loss: 0.0788 - classification_loss: 0.0113 394/500 [======================>.......] - ETA: 54s - loss: 0.0902 - regression_loss: 0.0789 - classification_loss: 0.0113 395/500 [======================>.......] - ETA: 53s - loss: 0.0903 - regression_loss: 0.0790 - classification_loss: 0.0113 396/500 [======================>.......] - ETA: 53s - loss: 0.0902 - regression_loss: 0.0790 - classification_loss: 0.0113 397/500 [======================>.......] - ETA: 52s - loss: 0.0901 - regression_loss: 0.0789 - classification_loss: 0.0112 398/500 [======================>.......] - ETA: 52s - loss: 0.0901 - regression_loss: 0.0788 - classification_loss: 0.0112 399/500 [======================>.......] - ETA: 51s - loss: 0.0901 - regression_loss: 0.0789 - classification_loss: 0.0112 400/500 [=======================>......] - ETA: 51s - loss: 0.0900 - regression_loss: 0.0787 - classification_loss: 0.0112 401/500 [=======================>......] - ETA: 50s - loss: 0.0899 - regression_loss: 0.0787 - classification_loss: 0.0112 402/500 [=======================>......] - ETA: 49s - loss: 0.0898 - regression_loss: 0.0786 - classification_loss: 0.0112 403/500 [=======================>......] - ETA: 49s - loss: 0.0897 - regression_loss: 0.0785 - classification_loss: 0.0112 404/500 [=======================>......] 
[Epoch 18, steps 405-500: per-step progress lines condensed; loss held near 0.089-0.091 (regression_loss ~0.078-0.080, classification_loss ~0.011), ETA counting down 48s -> 0s]
500/500 [==============================] - 255s 511ms/step - loss: 0.0897 - regression_loss: 0.0784 - classification_loss: 0.0112
326 instances of class plum with average precision: 0.8297
mAP: 0.8297
Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5
Epoch 19/150
[Epoch 19, steps 1-238: per-step progress lines condensed; loss settled around 0.084 (regression_loss ~0.073, classification_loss ~0.010) after early fluctuation between 0.075 and 0.148, ETA counting down 4:07 -> 2:14]
- ETA: 2:13 - loss: 0.0838 - regression_loss: 0.0734 - classification_loss: 0.0104 239/500 [=============>................] - ETA: 2:13 - loss: 0.0838 - regression_loss: 0.0734 - classification_loss: 0.0104 240/500 [=============>................] - ETA: 2:12 - loss: 0.0836 - regression_loss: 0.0732 - classification_loss: 0.0104 241/500 [=============>................] - ETA: 2:12 - loss: 0.0838 - regression_loss: 0.0734 - classification_loss: 0.0104 242/500 [=============>................] - ETA: 2:11 - loss: 0.0837 - regression_loss: 0.0734 - classification_loss: 0.0104 243/500 [=============>................] - ETA: 2:11 - loss: 0.0835 - regression_loss: 0.0732 - classification_loss: 0.0103 244/500 [=============>................] - ETA: 2:10 - loss: 0.0833 - regression_loss: 0.0730 - classification_loss: 0.0103 245/500 [=============>................] - ETA: 2:10 - loss: 0.0831 - regression_loss: 0.0728 - classification_loss: 0.0103 246/500 [=============>................] - ETA: 2:09 - loss: 0.0830 - regression_loss: 0.0727 - classification_loss: 0.0103 247/500 [=============>................] - ETA: 2:09 - loss: 0.0828 - regression_loss: 0.0726 - classification_loss: 0.0102 248/500 [=============>................] - ETA: 2:08 - loss: 0.0827 - regression_loss: 0.0725 - classification_loss: 0.0102 249/500 [=============>................] - ETA: 2:08 - loss: 0.0826 - regression_loss: 0.0724 - classification_loss: 0.0102 250/500 [==============>...............] - ETA: 2:07 - loss: 0.0827 - regression_loss: 0.0725 - classification_loss: 0.0102 251/500 [==============>...............] - ETA: 2:07 - loss: 0.0828 - regression_loss: 0.0725 - classification_loss: 0.0102 252/500 [==============>...............] - ETA: 2:06 - loss: 0.0832 - regression_loss: 0.0729 - classification_loss: 0.0103 253/500 [==============>...............] - ETA: 2:06 - loss: 0.0861 - regression_loss: 0.0749 - classification_loss: 0.0112 254/500 [==============>...............] 
- ETA: 2:05 - loss: 0.0860 - regression_loss: 0.0748 - classification_loss: 0.0112 255/500 [==============>...............] - ETA: 2:05 - loss: 0.0858 - regression_loss: 0.0747 - classification_loss: 0.0112 256/500 [==============>...............] - ETA: 2:04 - loss: 0.0857 - regression_loss: 0.0746 - classification_loss: 0.0112 257/500 [==============>...............] - ETA: 2:04 - loss: 0.0855 - regression_loss: 0.0744 - classification_loss: 0.0111 258/500 [==============>...............] - ETA: 2:03 - loss: 0.0856 - regression_loss: 0.0745 - classification_loss: 0.0111 259/500 [==============>...............] - ETA: 2:03 - loss: 0.0857 - regression_loss: 0.0745 - classification_loss: 0.0112 260/500 [==============>...............] - ETA: 2:02 - loss: 0.0856 - regression_loss: 0.0745 - classification_loss: 0.0111 261/500 [==============>...............] - ETA: 2:02 - loss: 0.0854 - regression_loss: 0.0743 - classification_loss: 0.0111 262/500 [==============>...............] - ETA: 2:01 - loss: 0.0851 - regression_loss: 0.0741 - classification_loss: 0.0111 263/500 [==============>...............] - ETA: 2:01 - loss: 0.0849 - regression_loss: 0.0739 - classification_loss: 0.0110 264/500 [==============>...............] - ETA: 2:00 - loss: 0.0848 - regression_loss: 0.0737 - classification_loss: 0.0110 265/500 [==============>...............] - ETA: 2:00 - loss: 0.0846 - regression_loss: 0.0736 - classification_loss: 0.0110 266/500 [==============>...............] - ETA: 1:59 - loss: 0.0844 - regression_loss: 0.0734 - classification_loss: 0.0110 267/500 [===============>..............] - ETA: 1:59 - loss: 0.0844 - regression_loss: 0.0734 - classification_loss: 0.0110 268/500 [===============>..............] - ETA: 1:58 - loss: 0.0846 - regression_loss: 0.0736 - classification_loss: 0.0110 269/500 [===============>..............] - ETA: 1:58 - loss: 0.0845 - regression_loss: 0.0735 - classification_loss: 0.0110 270/500 [===============>..............] 
- ETA: 1:57 - loss: 0.0844 - regression_loss: 0.0734 - classification_loss: 0.0110 271/500 [===============>..............] - ETA: 1:57 - loss: 0.0842 - regression_loss: 0.0733 - classification_loss: 0.0109 272/500 [===============>..............] - ETA: 1:56 - loss: 0.0851 - regression_loss: 0.0739 - classification_loss: 0.0112 273/500 [===============>..............] - ETA: 1:55 - loss: 0.0851 - regression_loss: 0.0739 - classification_loss: 0.0112 274/500 [===============>..............] - ETA: 1:55 - loss: 0.0849 - regression_loss: 0.0737 - classification_loss: 0.0112 275/500 [===============>..............] - ETA: 1:54 - loss: 0.0848 - regression_loss: 0.0736 - classification_loss: 0.0112 276/500 [===============>..............] - ETA: 1:54 - loss: 0.0846 - regression_loss: 0.0735 - classification_loss: 0.0111 277/500 [===============>..............] - ETA: 1:53 - loss: 0.0844 - regression_loss: 0.0733 - classification_loss: 0.0111 278/500 [===============>..............] - ETA: 1:53 - loss: 0.0843 - regression_loss: 0.0732 - classification_loss: 0.0111 279/500 [===============>..............] - ETA: 1:52 - loss: 0.0848 - regression_loss: 0.0737 - classification_loss: 0.0112 280/500 [===============>..............] - ETA: 1:52 - loss: 0.0861 - regression_loss: 0.0749 - classification_loss: 0.0112 281/500 [===============>..............] - ETA: 1:51 - loss: 0.0863 - regression_loss: 0.0751 - classification_loss: 0.0112 282/500 [===============>..............] - ETA: 1:51 - loss: 0.0862 - regression_loss: 0.0750 - classification_loss: 0.0112 283/500 [===============>..............] - ETA: 1:50 - loss: 0.0881 - regression_loss: 0.0767 - classification_loss: 0.0114 284/500 [================>.............] - ETA: 1:50 - loss: 0.0883 - regression_loss: 0.0769 - classification_loss: 0.0114 285/500 [================>.............] - ETA: 1:49 - loss: 0.0899 - regression_loss: 0.0783 - classification_loss: 0.0116 286/500 [================>.............] 
- ETA: 1:49 - loss: 0.0898 - regression_loss: 0.0783 - classification_loss: 0.0115 287/500 [================>.............] - ETA: 1:48 - loss: 0.0897 - regression_loss: 0.0782 - classification_loss: 0.0115 288/500 [================>.............] - ETA: 1:48 - loss: 0.0896 - regression_loss: 0.0781 - classification_loss: 0.0115 289/500 [================>.............] - ETA: 1:47 - loss: 0.0896 - regression_loss: 0.0781 - classification_loss: 0.0115 290/500 [================>.............] - ETA: 1:47 - loss: 0.0896 - regression_loss: 0.0781 - classification_loss: 0.0115 291/500 [================>.............] - ETA: 1:46 - loss: 0.0894 - regression_loss: 0.0779 - classification_loss: 0.0115 292/500 [================>.............] - ETA: 1:46 - loss: 0.0893 - regression_loss: 0.0778 - classification_loss: 0.0115 293/500 [================>.............] - ETA: 1:45 - loss: 0.0891 - regression_loss: 0.0776 - classification_loss: 0.0115 294/500 [================>.............] - ETA: 1:45 - loss: 0.0888 - regression_loss: 0.0774 - classification_loss: 0.0114 295/500 [================>.............] - ETA: 1:44 - loss: 0.0887 - regression_loss: 0.0773 - classification_loss: 0.0114 296/500 [================>.............] - ETA: 1:44 - loss: 0.0886 - regression_loss: 0.0772 - classification_loss: 0.0114 297/500 [================>.............] - ETA: 1:43 - loss: 0.0886 - regression_loss: 0.0772 - classification_loss: 0.0114 298/500 [================>.............] - ETA: 1:43 - loss: 0.0888 - regression_loss: 0.0774 - classification_loss: 0.0114 299/500 [================>.............] - ETA: 1:42 - loss: 0.0892 - regression_loss: 0.0777 - classification_loss: 0.0115 300/500 [=================>............] - ETA: 1:42 - loss: 0.0891 - regression_loss: 0.0776 - classification_loss: 0.0115 301/500 [=================>............] - ETA: 1:41 - loss: 0.0889 - regression_loss: 0.0775 - classification_loss: 0.0114 302/500 [=================>............] 
- ETA: 1:41 - loss: 0.0889 - regression_loss: 0.0775 - classification_loss: 0.0114 303/500 [=================>............] - ETA: 1:40 - loss: 0.0887 - regression_loss: 0.0774 - classification_loss: 0.0114 304/500 [=================>............] - ETA: 1:40 - loss: 0.0886 - regression_loss: 0.0772 - classification_loss: 0.0114 305/500 [=================>............] - ETA: 1:39 - loss: 0.0884 - regression_loss: 0.0771 - classification_loss: 0.0114 306/500 [=================>............] - ETA: 1:39 - loss: 0.0883 - regression_loss: 0.0769 - classification_loss: 0.0113 307/500 [=================>............] - ETA: 1:38 - loss: 0.0881 - regression_loss: 0.0768 - classification_loss: 0.0113 308/500 [=================>............] - ETA: 1:38 - loss: 0.0886 - regression_loss: 0.0772 - classification_loss: 0.0113 309/500 [=================>............] - ETA: 1:37 - loss: 0.0884 - regression_loss: 0.0771 - classification_loss: 0.0113 310/500 [=================>............] - ETA: 1:37 - loss: 0.0883 - regression_loss: 0.0770 - classification_loss: 0.0113 311/500 [=================>............] - ETA: 1:36 - loss: 0.0881 - regression_loss: 0.0768 - classification_loss: 0.0113 312/500 [=================>............] - ETA: 1:36 - loss: 0.0882 - regression_loss: 0.0769 - classification_loss: 0.0113 313/500 [=================>............] - ETA: 1:35 - loss: 0.0881 - regression_loss: 0.0768 - classification_loss: 0.0113 314/500 [=================>............] - ETA: 1:34 - loss: 0.0880 - regression_loss: 0.0767 - classification_loss: 0.0113 315/500 [=================>............] - ETA: 1:34 - loss: 0.0878 - regression_loss: 0.0766 - classification_loss: 0.0113 316/500 [=================>............] - ETA: 1:33 - loss: 0.0877 - regression_loss: 0.0764 - classification_loss: 0.0112 317/500 [==================>...........] - ETA: 1:33 - loss: 0.0879 - regression_loss: 0.0766 - classification_loss: 0.0113 318/500 [==================>...........] 
- ETA: 1:32 - loss: 0.0881 - regression_loss: 0.0768 - classification_loss: 0.0113 319/500 [==================>...........] - ETA: 1:32 - loss: 0.0879 - regression_loss: 0.0767 - classification_loss: 0.0112 320/500 [==================>...........] - ETA: 1:31 - loss: 0.0878 - regression_loss: 0.0766 - classification_loss: 0.0112 321/500 [==================>...........] - ETA: 1:31 - loss: 0.0878 - regression_loss: 0.0766 - classification_loss: 0.0112 322/500 [==================>...........] - ETA: 1:30 - loss: 0.0876 - regression_loss: 0.0764 - classification_loss: 0.0112 323/500 [==================>...........] - ETA: 1:30 - loss: 0.0875 - regression_loss: 0.0763 - classification_loss: 0.0112 324/500 [==================>...........] - ETA: 1:29 - loss: 0.0877 - regression_loss: 0.0765 - classification_loss: 0.0112 325/500 [==================>...........] - ETA: 1:29 - loss: 0.0876 - regression_loss: 0.0764 - classification_loss: 0.0112 326/500 [==================>...........] - ETA: 1:28 - loss: 0.0875 - regression_loss: 0.0764 - classification_loss: 0.0112 327/500 [==================>...........] - ETA: 1:28 - loss: 0.0874 - regression_loss: 0.0763 - classification_loss: 0.0111 328/500 [==================>...........] - ETA: 1:27 - loss: 0.0875 - regression_loss: 0.0763 - classification_loss: 0.0112 329/500 [==================>...........] - ETA: 1:27 - loss: 0.0874 - regression_loss: 0.0762 - classification_loss: 0.0112 330/500 [==================>...........] - ETA: 1:26 - loss: 0.0874 - regression_loss: 0.0762 - classification_loss: 0.0111 331/500 [==================>...........] - ETA: 1:26 - loss: 0.0872 - regression_loss: 0.0761 - classification_loss: 0.0111 332/500 [==================>...........] - ETA: 1:25 - loss: 0.0871 - regression_loss: 0.0760 - classification_loss: 0.0111 333/500 [==================>...........] - ETA: 1:25 - loss: 0.0870 - regression_loss: 0.0759 - classification_loss: 0.0111 334/500 [===================>..........] 
- ETA: 1:24 - loss: 0.0869 - regression_loss: 0.0758 - classification_loss: 0.0111 335/500 [===================>..........] - ETA: 1:24 - loss: 0.0868 - regression_loss: 0.0757 - classification_loss: 0.0111 336/500 [===================>..........] - ETA: 1:23 - loss: 0.0867 - regression_loss: 0.0756 - classification_loss: 0.0110 337/500 [===================>..........] - ETA: 1:23 - loss: 0.0880 - regression_loss: 0.0768 - classification_loss: 0.0112 338/500 [===================>..........] - ETA: 1:22 - loss: 0.0880 - regression_loss: 0.0769 - classification_loss: 0.0112 339/500 [===================>..........] - ETA: 1:22 - loss: 0.0893 - regression_loss: 0.0779 - classification_loss: 0.0114 340/500 [===================>..........] - ETA: 1:21 - loss: 0.0891 - regression_loss: 0.0777 - classification_loss: 0.0114 341/500 [===================>..........] - ETA: 1:21 - loss: 0.0889 - regression_loss: 0.0775 - classification_loss: 0.0114 342/500 [===================>..........] - ETA: 1:20 - loss: 0.0888 - regression_loss: 0.0774 - classification_loss: 0.0114 343/500 [===================>..........] - ETA: 1:20 - loss: 0.0887 - regression_loss: 0.0774 - classification_loss: 0.0113 344/500 [===================>..........] - ETA: 1:19 - loss: 0.0886 - regression_loss: 0.0773 - classification_loss: 0.0113 345/500 [===================>..........] - ETA: 1:19 - loss: 0.0884 - regression_loss: 0.0771 - classification_loss: 0.0113 346/500 [===================>..........] - ETA: 1:18 - loss: 0.0884 - regression_loss: 0.0771 - classification_loss: 0.0113 347/500 [===================>..........] - ETA: 1:18 - loss: 0.0884 - regression_loss: 0.0771 - classification_loss: 0.0113 348/500 [===================>..........] - ETA: 1:17 - loss: 0.0882 - regression_loss: 0.0770 - classification_loss: 0.0113 349/500 [===================>..........] - ETA: 1:17 - loss: 0.0881 - regression_loss: 0.0769 - classification_loss: 0.0113 350/500 [====================>.........] 
- ETA: 1:16 - loss: 0.0879 - regression_loss: 0.0767 - classification_loss: 0.0113 351/500 [====================>.........] - ETA: 1:16 - loss: 0.0878 - regression_loss: 0.0766 - classification_loss: 0.0112 352/500 [====================>.........] - ETA: 1:15 - loss: 0.0878 - regression_loss: 0.0765 - classification_loss: 0.0112 353/500 [====================>.........] - ETA: 1:15 - loss: 0.0879 - regression_loss: 0.0767 - classification_loss: 0.0112 354/500 [====================>.........] - ETA: 1:14 - loss: 0.0879 - regression_loss: 0.0766 - classification_loss: 0.0112 355/500 [====================>.........] - ETA: 1:14 - loss: 0.0877 - regression_loss: 0.0765 - classification_loss: 0.0112 356/500 [====================>.........] - ETA: 1:13 - loss: 0.0877 - regression_loss: 0.0765 - classification_loss: 0.0112 357/500 [====================>.........] - ETA: 1:12 - loss: 0.0877 - regression_loss: 0.0765 - classification_loss: 0.0112 358/500 [====================>.........] - ETA: 1:12 - loss: 0.0876 - regression_loss: 0.0764 - classification_loss: 0.0112 359/500 [====================>.........] - ETA: 1:11 - loss: 0.0874 - regression_loss: 0.0763 - classification_loss: 0.0112 360/500 [====================>.........] - ETA: 1:11 - loss: 0.0873 - regression_loss: 0.0762 - classification_loss: 0.0112 361/500 [====================>.........] - ETA: 1:10 - loss: 0.0872 - regression_loss: 0.0761 - classification_loss: 0.0111 362/500 [====================>.........] - ETA: 1:10 - loss: 0.0871 - regression_loss: 0.0759 - classification_loss: 0.0111 363/500 [====================>.........] - ETA: 1:09 - loss: 0.0870 - regression_loss: 0.0759 - classification_loss: 0.0111 364/500 [====================>.........] - ETA: 1:09 - loss: 0.0871 - regression_loss: 0.0759 - classification_loss: 0.0111 365/500 [====================>.........] - ETA: 1:08 - loss: 0.0869 - regression_loss: 0.0758 - classification_loss: 0.0111 366/500 [====================>.........] 
- ETA: 1:08 - loss: 0.0868 - regression_loss: 0.0757 - classification_loss: 0.0111 367/500 [=====================>........] - ETA: 1:07 - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0111 368/500 [=====================>........] - ETA: 1:07 - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0111 369/500 [=====================>........] - ETA: 1:06 - loss: 0.0867 - regression_loss: 0.0756 - classification_loss: 0.0111 370/500 [=====================>........] - ETA: 1:06 - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0110 371/500 [=====================>........] - ETA: 1:05 - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0110 372/500 [=====================>........] - ETA: 1:05 - loss: 0.0864 - regression_loss: 0.0754 - classification_loss: 0.0110 373/500 [=====================>........] - ETA: 1:04 - loss: 0.0863 - regression_loss: 0.0753 - classification_loss: 0.0110 374/500 [=====================>........] - ETA: 1:04 - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0111 375/500 [=====================>........] - ETA: 1:03 - loss: 0.0864 - regression_loss: 0.0754 - classification_loss: 0.0110 376/500 [=====================>........] - ETA: 1:03 - loss: 0.0867 - regression_loss: 0.0757 - classification_loss: 0.0111 377/500 [=====================>........] - ETA: 1:02 - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0111 378/500 [=====================>........] - ETA: 1:02 - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0110 379/500 [=====================>........] - ETA: 1:01 - loss: 0.0868 - regression_loss: 0.0758 - classification_loss: 0.0111 380/500 [=====================>........] - ETA: 1:01 - loss: 0.0869 - regression_loss: 0.0758 - classification_loss: 0.0111 381/500 [=====================>........] - ETA: 1:00 - loss: 0.0868 - regression_loss: 0.0757 - classification_loss: 0.0111 382/500 [=====================>........] 
- ETA: 1:00 - loss: 0.0867 - regression_loss: 0.0757 - classification_loss: 0.0110 383/500 [=====================>........] - ETA: 59s - loss: 0.0867 - regression_loss: 0.0757 - classification_loss: 0.0110  384/500 [======================>.......] - ETA: 59s - loss: 0.0868 - regression_loss: 0.0757 - classification_loss: 0.0110 385/500 [======================>.......] - ETA: 58s - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0110 386/500 [======================>.......] - ETA: 58s - loss: 0.0865 - regression_loss: 0.0756 - classification_loss: 0.0110 387/500 [======================>.......] - ETA: 57s - loss: 0.0864 - regression_loss: 0.0754 - classification_loss: 0.0110 388/500 [======================>.......] - ETA: 57s - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0110 389/500 [======================>.......] - ETA: 56s - loss: 0.0869 - regression_loss: 0.0759 - classification_loss: 0.0110 390/500 [======================>.......] - ETA: 56s - loss: 0.0868 - regression_loss: 0.0758 - classification_loss: 0.0110 391/500 [======================>.......] - ETA: 55s - loss: 0.0867 - regression_loss: 0.0757 - classification_loss: 0.0110 392/500 [======================>.......] - ETA: 55s - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0110 393/500 [======================>.......] - ETA: 54s - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0110 394/500 [======================>.......] - ETA: 54s - loss: 0.0870 - regression_loss: 0.0760 - classification_loss: 0.0110 395/500 [======================>.......] - ETA: 53s - loss: 0.0869 - regression_loss: 0.0759 - classification_loss: 0.0110 396/500 [======================>.......] - ETA: 53s - loss: 0.0871 - regression_loss: 0.0761 - classification_loss: 0.0110 397/500 [======================>.......] - ETA: 52s - loss: 0.0870 - regression_loss: 0.0761 - classification_loss: 0.0110 398/500 [======================>.......] 
- ETA: 52s - loss: 0.0871 - regression_loss: 0.0762 - classification_loss: 0.0110 399/500 [======================>.......] - ETA: 51s - loss: 0.0871 - regression_loss: 0.0761 - classification_loss: 0.0110 400/500 [=======================>......] - ETA: 51s - loss: 0.0870 - regression_loss: 0.0760 - classification_loss: 0.0110 401/500 [=======================>......] - ETA: 50s - loss: 0.0872 - regression_loss: 0.0762 - classification_loss: 0.0110 402/500 [=======================>......] - ETA: 50s - loss: 0.0871 - regression_loss: 0.0761 - classification_loss: 0.0110 403/500 [=======================>......] - ETA: 49s - loss: 0.0871 - regression_loss: 0.0761 - classification_loss: 0.0110 404/500 [=======================>......] - ETA: 48s - loss: 0.0871 - regression_loss: 0.0761 - classification_loss: 0.0110 405/500 [=======================>......] - ETA: 48s - loss: 0.0869 - regression_loss: 0.0760 - classification_loss: 0.0109 406/500 [=======================>......] - ETA: 47s - loss: 0.0870 - regression_loss: 0.0760 - classification_loss: 0.0109 407/500 [=======================>......] - ETA: 47s - loss: 0.0870 - regression_loss: 0.0760 - classification_loss: 0.0109 408/500 [=======================>......] - ETA: 46s - loss: 0.0869 - regression_loss: 0.0759 - classification_loss: 0.0109 409/500 [=======================>......] - ETA: 46s - loss: 0.0867 - regression_loss: 0.0758 - classification_loss: 0.0109 410/500 [=======================>......] - ETA: 45s - loss: 0.0866 - regression_loss: 0.0757 - classification_loss: 0.0109 411/500 [=======================>......] - ETA: 45s - loss: 0.0866 - regression_loss: 0.0757 - classification_loss: 0.0109 412/500 [=======================>......] - ETA: 44s - loss: 0.0873 - regression_loss: 0.0765 - classification_loss: 0.0109 413/500 [=======================>......] - ETA: 44s - loss: 0.0872 - regression_loss: 0.0763 - classification_loss: 0.0109 414/500 [=======================>......] 
- ETA: 43s - loss: 0.0871 - regression_loss: 0.0762 - classification_loss: 0.0109 415/500 [=======================>......] - ETA: 43s - loss: 0.0872 - regression_loss: 0.0763 - classification_loss: 0.0109 416/500 [=======================>......] - ETA: 42s - loss: 0.0871 - regression_loss: 0.0763 - classification_loss: 0.0109 417/500 [========================>.....] - ETA: 42s - loss: 0.0871 - regression_loss: 0.0762 - classification_loss: 0.0108 418/500 [========================>.....] - ETA: 41s - loss: 0.0870 - regression_loss: 0.0761 - classification_loss: 0.0108 419/500 [========================>.....] - ETA: 41s - loss: 0.0870 - regression_loss: 0.0762 - classification_loss: 0.0108 420/500 [========================>.....] - ETA: 40s - loss: 0.0871 - regression_loss: 0.0763 - classification_loss: 0.0108 421/500 [========================>.....] - ETA: 40s - loss: 0.0870 - regression_loss: 0.0762 - classification_loss: 0.0108 422/500 [========================>.....] - ETA: 39s - loss: 0.0883 - regression_loss: 0.0773 - classification_loss: 0.0110 423/500 [========================>.....] - ETA: 39s - loss: 0.0882 - regression_loss: 0.0772 - classification_loss: 0.0110 424/500 [========================>.....] - ETA: 38s - loss: 0.0881 - regression_loss: 0.0771 - classification_loss: 0.0109 425/500 [========================>.....] - ETA: 38s - loss: 0.0880 - regression_loss: 0.0771 - classification_loss: 0.0109 426/500 [========================>.....] - ETA: 37s - loss: 0.0879 - regression_loss: 0.0770 - classification_loss: 0.0109 427/500 [========================>.....] - ETA: 37s - loss: 0.0878 - regression_loss: 0.0769 - classification_loss: 0.0109 428/500 [========================>.....] - ETA: 36s - loss: 0.0876 - regression_loss: 0.0767 - classification_loss: 0.0109 429/500 [========================>.....] - ETA: 36s - loss: 0.0875 - regression_loss: 0.0766 - classification_loss: 0.0109 430/500 [========================>.....] 
- ETA: 35s - loss: 0.0874 - regression_loss: 0.0765 - classification_loss: 0.0109 431/500 [========================>.....] - ETA: 35s - loss: 0.0875 - regression_loss: 0.0766 - classification_loss: 0.0109 432/500 [========================>.....] - ETA: 34s - loss: 0.0873 - regression_loss: 0.0765 - classification_loss: 0.0108 433/500 [========================>.....] - ETA: 34s - loss: 0.0892 - regression_loss: 0.0778 - classification_loss: 0.0114 434/500 [=========================>....] - ETA: 33s - loss: 0.0891 - regression_loss: 0.0777 - classification_loss: 0.0114 435/500 [=========================>....] - ETA: 33s - loss: 0.0892 - regression_loss: 0.0778 - classification_loss: 0.0114 436/500 [=========================>....] - ETA: 32s - loss: 0.0891 - regression_loss: 0.0777 - classification_loss: 0.0114 437/500 [=========================>....] - ETA: 32s - loss: 0.0890 - regression_loss: 0.0776 - classification_loss: 0.0114 438/500 [=========================>....] - ETA: 31s - loss: 0.0889 - regression_loss: 0.0775 - classification_loss: 0.0114 439/500 [=========================>....] - ETA: 31s - loss: 0.0888 - regression_loss: 0.0774 - classification_loss: 0.0114 440/500 [=========================>....] - ETA: 30s - loss: 0.0887 - regression_loss: 0.0773 - classification_loss: 0.0114 441/500 [=========================>....] - ETA: 30s - loss: 0.0890 - regression_loss: 0.0776 - classification_loss: 0.0114 442/500 [=========================>....] - ETA: 29s - loss: 0.0888 - regression_loss: 0.0775 - classification_loss: 0.0113 443/500 [=========================>....] - ETA: 29s - loss: 0.0891 - regression_loss: 0.0777 - classification_loss: 0.0114 444/500 [=========================>....] - ETA: 28s - loss: 0.0890 - regression_loss: 0.0776 - classification_loss: 0.0114 445/500 [=========================>....] - ETA: 28s - loss: 0.0889 - regression_loss: 0.0775 - classification_loss: 0.0114 446/500 [=========================>....] 
[Epoch 19: per-batch progress output (batches 447-499) truncated]
500/500 [==============================] - 255s 511ms/step - loss: 0.0864 - regression_loss: 0.0753 - classification_loss: 0.0111
326 instances of class plum with average precision: 0.8342
mAP: 0.8342
Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5
Epoch 20/150
[Epoch 20: per-batch progress output (batches 1-281) truncated]
- ETA: 1:51 - loss: 0.0850 - regression_loss: 0.0743 - classification_loss: 0.0108 282/500 [===============>..............] - ETA: 1:51 - loss: 0.0854 - regression_loss: 0.0746 - classification_loss: 0.0108 283/500 [===============>..............] - ETA: 1:50 - loss: 0.0854 - regression_loss: 0.0746 - classification_loss: 0.0108 284/500 [================>.............] - ETA: 1:50 - loss: 0.0852 - regression_loss: 0.0745 - classification_loss: 0.0108 285/500 [================>.............] - ETA: 1:49 - loss: 0.0852 - regression_loss: 0.0744 - classification_loss: 0.0107 286/500 [================>.............] - ETA: 1:49 - loss: 0.0850 - regression_loss: 0.0743 - classification_loss: 0.0107 287/500 [================>.............] - ETA: 1:48 - loss: 0.0848 - regression_loss: 0.0741 - classification_loss: 0.0107 288/500 [================>.............] - ETA: 1:48 - loss: 0.0847 - regression_loss: 0.0740 - classification_loss: 0.0107 289/500 [================>.............] - ETA: 1:47 - loss: 0.0845 - regression_loss: 0.0738 - classification_loss: 0.0107 290/500 [================>.............] - ETA: 1:47 - loss: 0.0849 - regression_loss: 0.0741 - classification_loss: 0.0108 291/500 [================>.............] - ETA: 1:46 - loss: 0.0848 - regression_loss: 0.0741 - classification_loss: 0.0108 292/500 [================>.............] - ETA: 1:46 - loss: 0.0849 - regression_loss: 0.0742 - classification_loss: 0.0108 293/500 [================>.............] - ETA: 1:45 - loss: 0.0849 - regression_loss: 0.0741 - classification_loss: 0.0107 294/500 [================>.............] - ETA: 1:45 - loss: 0.0846 - regression_loss: 0.0739 - classification_loss: 0.0107 295/500 [================>.............] - ETA: 1:44 - loss: 0.0845 - regression_loss: 0.0738 - classification_loss: 0.0107 296/500 [================>.............] - ETA: 1:44 - loss: 0.0844 - regression_loss: 0.0737 - classification_loss: 0.0107 297/500 [================>.............] 
- ETA: 1:43 - loss: 0.0847 - regression_loss: 0.0740 - classification_loss: 0.0107 298/500 [================>.............] - ETA: 1:43 - loss: 0.0845 - regression_loss: 0.0739 - classification_loss: 0.0106 299/500 [================>.............] - ETA: 1:42 - loss: 0.0846 - regression_loss: 0.0739 - classification_loss: 0.0107 300/500 [=================>............] - ETA: 1:42 - loss: 0.0845 - regression_loss: 0.0739 - classification_loss: 0.0107 301/500 [=================>............] - ETA: 1:41 - loss: 0.0844 - regression_loss: 0.0738 - classification_loss: 0.0106 302/500 [=================>............] - ETA: 1:41 - loss: 0.0843 - regression_loss: 0.0737 - classification_loss: 0.0106 303/500 [=================>............] - ETA: 1:40 - loss: 0.0841 - regression_loss: 0.0735 - classification_loss: 0.0106 304/500 [=================>............] - ETA: 1:40 - loss: 0.0840 - regression_loss: 0.0734 - classification_loss: 0.0106 305/500 [=================>............] - ETA: 1:39 - loss: 0.0838 - regression_loss: 0.0732 - classification_loss: 0.0106 306/500 [=================>............] - ETA: 1:39 - loss: 0.0836 - regression_loss: 0.0730 - classification_loss: 0.0105 307/500 [=================>............] - ETA: 1:38 - loss: 0.0834 - regression_loss: 0.0729 - classification_loss: 0.0105 308/500 [=================>............] - ETA: 1:38 - loss: 0.0833 - regression_loss: 0.0728 - classification_loss: 0.0105 309/500 [=================>............] - ETA: 1:37 - loss: 0.0833 - regression_loss: 0.0727 - classification_loss: 0.0105 310/500 [=================>............] - ETA: 1:37 - loss: 0.0832 - regression_loss: 0.0727 - classification_loss: 0.0105 311/500 [=================>............] - ETA: 1:36 - loss: 0.0832 - regression_loss: 0.0726 - classification_loss: 0.0105 312/500 [=================>............] - ETA: 1:36 - loss: 0.0829 - regression_loss: 0.0724 - classification_loss: 0.0105 313/500 [=================>............] 
- ETA: 1:35 - loss: 0.0848 - regression_loss: 0.0741 - classification_loss: 0.0107 314/500 [=================>............] - ETA: 1:35 - loss: 0.0874 - regression_loss: 0.0758 - classification_loss: 0.0115 315/500 [=================>............] - ETA: 1:34 - loss: 0.0872 - regression_loss: 0.0757 - classification_loss: 0.0115 316/500 [=================>............] - ETA: 1:34 - loss: 0.0870 - regression_loss: 0.0755 - classification_loss: 0.0115 317/500 [==================>...........] - ETA: 1:33 - loss: 0.0870 - regression_loss: 0.0755 - classification_loss: 0.0115 318/500 [==================>...........] - ETA: 1:33 - loss: 0.0869 - regression_loss: 0.0755 - classification_loss: 0.0114 319/500 [==================>...........] - ETA: 1:32 - loss: 0.0868 - regression_loss: 0.0753 - classification_loss: 0.0114 320/500 [==================>...........] - ETA: 1:32 - loss: 0.0866 - regression_loss: 0.0752 - classification_loss: 0.0114 321/500 [==================>...........] - ETA: 1:31 - loss: 0.0868 - regression_loss: 0.0754 - classification_loss: 0.0114 322/500 [==================>...........] - ETA: 1:31 - loss: 0.0868 - regression_loss: 0.0754 - classification_loss: 0.0114 323/500 [==================>...........] - ETA: 1:30 - loss: 0.0869 - regression_loss: 0.0755 - classification_loss: 0.0114 324/500 [==================>...........] - ETA: 1:29 - loss: 0.0880 - regression_loss: 0.0764 - classification_loss: 0.0117 325/500 [==================>...........] - ETA: 1:29 - loss: 0.0882 - regression_loss: 0.0765 - classification_loss: 0.0117 326/500 [==================>...........] - ETA: 1:28 - loss: 0.0880 - regression_loss: 0.0763 - classification_loss: 0.0117 327/500 [==================>...........] - ETA: 1:28 - loss: 0.0878 - regression_loss: 0.0762 - classification_loss: 0.0116 328/500 [==================>...........] - ETA: 1:27 - loss: 0.0884 - regression_loss: 0.0767 - classification_loss: 0.0117 329/500 [==================>...........] 
- ETA: 1:27 - loss: 0.0882 - regression_loss: 0.0765 - classification_loss: 0.0117 330/500 [==================>...........] - ETA: 1:26 - loss: 0.0887 - regression_loss: 0.0770 - classification_loss: 0.0117 331/500 [==================>...........] - ETA: 1:26 - loss: 0.0886 - regression_loss: 0.0769 - classification_loss: 0.0117 332/500 [==================>...........] - ETA: 1:25 - loss: 0.0884 - regression_loss: 0.0768 - classification_loss: 0.0117 333/500 [==================>...........] - ETA: 1:25 - loss: 0.0883 - regression_loss: 0.0767 - classification_loss: 0.0116 334/500 [===================>..........] - ETA: 1:24 - loss: 0.0882 - regression_loss: 0.0765 - classification_loss: 0.0116 335/500 [===================>..........] - ETA: 1:24 - loss: 0.0879 - regression_loss: 0.0763 - classification_loss: 0.0116 336/500 [===================>..........] - ETA: 1:23 - loss: 0.0878 - regression_loss: 0.0762 - classification_loss: 0.0116 337/500 [===================>..........] - ETA: 1:23 - loss: 0.0878 - regression_loss: 0.0762 - classification_loss: 0.0116 338/500 [===================>..........] - ETA: 1:22 - loss: 0.0876 - regression_loss: 0.0761 - classification_loss: 0.0116 339/500 [===================>..........] - ETA: 1:22 - loss: 0.0876 - regression_loss: 0.0760 - classification_loss: 0.0116 340/500 [===================>..........] - ETA: 1:21 - loss: 0.0874 - regression_loss: 0.0759 - classification_loss: 0.0115 341/500 [===================>..........] - ETA: 1:21 - loss: 0.0873 - regression_loss: 0.0758 - classification_loss: 0.0115 342/500 [===================>..........] - ETA: 1:20 - loss: 0.0872 - regression_loss: 0.0757 - classification_loss: 0.0115 343/500 [===================>..........] - ETA: 1:20 - loss: 0.0872 - regression_loss: 0.0757 - classification_loss: 0.0115 344/500 [===================>..........] - ETA: 1:19 - loss: 0.0870 - regression_loss: 0.0756 - classification_loss: 0.0115 345/500 [===================>..........] 
- ETA: 1:19 - loss: 0.0879 - regression_loss: 0.0764 - classification_loss: 0.0115 346/500 [===================>..........] - ETA: 1:18 - loss: 0.0878 - regression_loss: 0.0764 - classification_loss: 0.0115 347/500 [===================>..........] - ETA: 1:18 - loss: 0.0880 - regression_loss: 0.0765 - classification_loss: 0.0115 348/500 [===================>..........] - ETA: 1:17 - loss: 0.0879 - regression_loss: 0.0765 - classification_loss: 0.0115 349/500 [===================>..........] - ETA: 1:17 - loss: 0.0882 - regression_loss: 0.0767 - classification_loss: 0.0115 350/500 [====================>.........] - ETA: 1:16 - loss: 0.0880 - regression_loss: 0.0766 - classification_loss: 0.0114 351/500 [====================>.........] - ETA: 1:16 - loss: 0.0880 - regression_loss: 0.0765 - classification_loss: 0.0114 352/500 [====================>.........] - ETA: 1:15 - loss: 0.0878 - regression_loss: 0.0764 - classification_loss: 0.0114 353/500 [====================>.........] - ETA: 1:15 - loss: 0.0878 - regression_loss: 0.0764 - classification_loss: 0.0114 354/500 [====================>.........] - ETA: 1:14 - loss: 0.0879 - regression_loss: 0.0764 - classification_loss: 0.0114 355/500 [====================>.........] - ETA: 1:14 - loss: 0.0877 - regression_loss: 0.0763 - classification_loss: 0.0114 356/500 [====================>.........] - ETA: 1:13 - loss: 0.0878 - regression_loss: 0.0764 - classification_loss: 0.0114 357/500 [====================>.........] - ETA: 1:13 - loss: 0.0877 - regression_loss: 0.0763 - classification_loss: 0.0114 358/500 [====================>.........] - ETA: 1:12 - loss: 0.0878 - regression_loss: 0.0764 - classification_loss: 0.0114 359/500 [====================>.........] - ETA: 1:12 - loss: 0.0877 - regression_loss: 0.0763 - classification_loss: 0.0114 360/500 [====================>.........] - ETA: 1:11 - loss: 0.0875 - regression_loss: 0.0762 - classification_loss: 0.0114 361/500 [====================>.........] 
- ETA: 1:11 - loss: 0.0874 - regression_loss: 0.0760 - classification_loss: 0.0114 362/500 [====================>.........] - ETA: 1:10 - loss: 0.0875 - regression_loss: 0.0760 - classification_loss: 0.0114 363/500 [====================>.........] - ETA: 1:10 - loss: 0.0874 - regression_loss: 0.0760 - classification_loss: 0.0114 364/500 [====================>.........] - ETA: 1:09 - loss: 0.0874 - regression_loss: 0.0760 - classification_loss: 0.0114 365/500 [====================>.........] - ETA: 1:08 - loss: 0.0875 - regression_loss: 0.0761 - classification_loss: 0.0114 366/500 [====================>.........] - ETA: 1:08 - loss: 0.0873 - regression_loss: 0.0760 - classification_loss: 0.0113 367/500 [=====================>........] - ETA: 1:07 - loss: 0.0872 - regression_loss: 0.0759 - classification_loss: 0.0113 368/500 [=====================>........] - ETA: 1:07 - loss: 0.0872 - regression_loss: 0.0759 - classification_loss: 0.0113 369/500 [=====================>........] - ETA: 1:06 - loss: 0.0872 - regression_loss: 0.0759 - classification_loss: 0.0113 370/500 [=====================>........] - ETA: 1:06 - loss: 0.0870 - regression_loss: 0.0757 - classification_loss: 0.0113 371/500 [=====================>........] - ETA: 1:05 - loss: 0.0871 - regression_loss: 0.0758 - classification_loss: 0.0113 372/500 [=====================>........] - ETA: 1:05 - loss: 0.0870 - regression_loss: 0.0757 - classification_loss: 0.0113 373/500 [=====================>........] - ETA: 1:04 - loss: 0.0869 - regression_loss: 0.0756 - classification_loss: 0.0113 374/500 [=====================>........] - ETA: 1:04 - loss: 0.0868 - regression_loss: 0.0756 - classification_loss: 0.0113 375/500 [=====================>........] - ETA: 1:03 - loss: 0.0869 - regression_loss: 0.0757 - classification_loss: 0.0113 376/500 [=====================>........] - ETA: 1:03 - loss: 0.0868 - regression_loss: 0.0756 - classification_loss: 0.0113 377/500 [=====================>........] 
- ETA: 1:02 - loss: 0.0867 - regression_loss: 0.0755 - classification_loss: 0.0113 378/500 [=====================>........] - ETA: 1:02 - loss: 0.0866 - regression_loss: 0.0753 - classification_loss: 0.0112 379/500 [=====================>........] - ETA: 1:01 - loss: 0.0866 - regression_loss: 0.0754 - classification_loss: 0.0112 380/500 [=====================>........] - ETA: 1:01 - loss: 0.0865 - regression_loss: 0.0753 - classification_loss: 0.0112 381/500 [=====================>........] - ETA: 1:00 - loss: 0.0864 - regression_loss: 0.0752 - classification_loss: 0.0112 382/500 [=====================>........] - ETA: 1:00 - loss: 0.0863 - regression_loss: 0.0751 - classification_loss: 0.0112 383/500 [=====================>........] - ETA: 59s - loss: 0.0861 - regression_loss: 0.0750 - classification_loss: 0.0112  384/500 [======================>.......] - ETA: 59s - loss: 0.0860 - regression_loss: 0.0749 - classification_loss: 0.0112 385/500 [======================>.......] - ETA: 58s - loss: 0.0859 - regression_loss: 0.0747 - classification_loss: 0.0111 386/500 [======================>.......] - ETA: 58s - loss: 0.0858 - regression_loss: 0.0747 - classification_loss: 0.0111 387/500 [======================>.......] - ETA: 57s - loss: 0.0857 - regression_loss: 0.0746 - classification_loss: 0.0111 388/500 [======================>.......] - ETA: 57s - loss: 0.0856 - regression_loss: 0.0745 - classification_loss: 0.0111 389/500 [======================>.......] - ETA: 56s - loss: 0.0857 - regression_loss: 0.0746 - classification_loss: 0.0111 390/500 [======================>.......] - ETA: 56s - loss: 0.0866 - regression_loss: 0.0755 - classification_loss: 0.0111 391/500 [======================>.......] - ETA: 55s - loss: 0.0865 - regression_loss: 0.0754 - classification_loss: 0.0111 392/500 [======================>.......] - ETA: 55s - loss: 0.0864 - regression_loss: 0.0753 - classification_loss: 0.0111 393/500 [======================>.......] 
- ETA: 54s - loss: 0.0862 - regression_loss: 0.0751 - classification_loss: 0.0111 394/500 [======================>.......] - ETA: 54s - loss: 0.0860 - regression_loss: 0.0750 - classification_loss: 0.0111 395/500 [======================>.......] - ETA: 53s - loss: 0.0859 - regression_loss: 0.0748 - classification_loss: 0.0110 396/500 [======================>.......] - ETA: 53s - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0110 397/500 [======================>.......] - ETA: 52s - loss: 0.0864 - regression_loss: 0.0754 - classification_loss: 0.0110 398/500 [======================>.......] - ETA: 52s - loss: 0.0864 - regression_loss: 0.0754 - classification_loss: 0.0110 399/500 [======================>.......] - ETA: 51s - loss: 0.0864 - regression_loss: 0.0754 - classification_loss: 0.0110 400/500 [=======================>......] - ETA: 51s - loss: 0.0863 - regression_loss: 0.0753 - classification_loss: 0.0110 401/500 [=======================>......] - ETA: 50s - loss: 0.0862 - regression_loss: 0.0752 - classification_loss: 0.0110 402/500 [=======================>......] - ETA: 50s - loss: 0.0862 - regression_loss: 0.0752 - classification_loss: 0.0110 403/500 [=======================>......] - ETA: 49s - loss: 0.0867 - regression_loss: 0.0757 - classification_loss: 0.0110 404/500 [=======================>......] - ETA: 49s - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0110 405/500 [=======================>......] - ETA: 48s - loss: 0.0868 - regression_loss: 0.0758 - classification_loss: 0.0110 406/500 [=======================>......] - ETA: 48s - loss: 0.0866 - regression_loss: 0.0757 - classification_loss: 0.0110 407/500 [=======================>......] - ETA: 47s - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0110 408/500 [=======================>......] - ETA: 47s - loss: 0.0867 - regression_loss: 0.0757 - classification_loss: 0.0110 409/500 [=======================>......] 
- ETA: 46s - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0110 410/500 [=======================>......] - ETA: 45s - loss: 0.0867 - regression_loss: 0.0757 - classification_loss: 0.0110 411/500 [=======================>......] - ETA: 45s - loss: 0.0866 - regression_loss: 0.0756 - classification_loss: 0.0110 412/500 [=======================>......] - ETA: 44s - loss: 0.0865 - regression_loss: 0.0756 - classification_loss: 0.0110 413/500 [=======================>......] - ETA: 44s - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0110 414/500 [=======================>......] - ETA: 43s - loss: 0.0863 - regression_loss: 0.0754 - classification_loss: 0.0110 415/500 [=======================>......] - ETA: 43s - loss: 0.0862 - regression_loss: 0.0753 - classification_loss: 0.0110 416/500 [=======================>......] - ETA: 42s - loss: 0.0865 - regression_loss: 0.0756 - classification_loss: 0.0110 417/500 [========================>.....] - ETA: 42s - loss: 0.0865 - regression_loss: 0.0755 - classification_loss: 0.0110 418/500 [========================>.....] - ETA: 41s - loss: 0.0868 - regression_loss: 0.0757 - classification_loss: 0.0110 419/500 [========================>.....] - ETA: 41s - loss: 0.0867 - regression_loss: 0.0756 - classification_loss: 0.0110 420/500 [========================>.....] - ETA: 40s - loss: 0.0875 - regression_loss: 0.0763 - classification_loss: 0.0112 421/500 [========================>.....] - ETA: 40s - loss: 0.0874 - regression_loss: 0.0761 - classification_loss: 0.0112 422/500 [========================>.....] - ETA: 39s - loss: 0.0872 - regression_loss: 0.0760 - classification_loss: 0.0112 423/500 [========================>.....] - ETA: 39s - loss: 0.0871 - regression_loss: 0.0759 - classification_loss: 0.0112 424/500 [========================>.....] - ETA: 38s - loss: 0.0870 - regression_loss: 0.0758 - classification_loss: 0.0112 425/500 [========================>.....] 
- ETA: 38s - loss: 0.0869 - regression_loss: 0.0757 - classification_loss: 0.0112 426/500 [========================>.....] - ETA: 37s - loss: 0.0868 - regression_loss: 0.0757 - classification_loss: 0.0112 427/500 [========================>.....] - ETA: 37s - loss: 0.0868 - regression_loss: 0.0757 - classification_loss: 0.0112 428/500 [========================>.....] - ETA: 36s - loss: 0.0868 - regression_loss: 0.0756 - classification_loss: 0.0112 429/500 [========================>.....] - ETA: 36s - loss: 0.0867 - regression_loss: 0.0755 - classification_loss: 0.0112 430/500 [========================>.....] - ETA: 35s - loss: 0.0866 - regression_loss: 0.0755 - classification_loss: 0.0111 431/500 [========================>.....] - ETA: 35s - loss: 0.0866 - regression_loss: 0.0754 - classification_loss: 0.0111 432/500 [========================>.....] - ETA: 34s - loss: 0.0865 - regression_loss: 0.0754 - classification_loss: 0.0111 433/500 [========================>.....] - ETA: 34s - loss: 0.0864 - regression_loss: 0.0753 - classification_loss: 0.0111 434/500 [=========================>....] - ETA: 33s - loss: 0.0864 - regression_loss: 0.0753 - classification_loss: 0.0111 435/500 [=========================>....] - ETA: 33s - loss: 0.0865 - regression_loss: 0.0754 - classification_loss: 0.0111 436/500 [=========================>....] - ETA: 32s - loss: 0.0871 - regression_loss: 0.0758 - classification_loss: 0.0113 437/500 [=========================>....] - ETA: 32s - loss: 0.0874 - regression_loss: 0.0760 - classification_loss: 0.0113 438/500 [=========================>....] - ETA: 31s - loss: 0.0873 - regression_loss: 0.0760 - classification_loss: 0.0113 439/500 [=========================>....] - ETA: 31s - loss: 0.0872 - regression_loss: 0.0759 - classification_loss: 0.0113 440/500 [=========================>....] - ETA: 30s - loss: 0.0872 - regression_loss: 0.0759 - classification_loss: 0.0113 441/500 [=========================>....] 
- ETA: 30s - loss: 0.0871 - regression_loss: 0.0759 - classification_loss: 0.0113 442/500 [=========================>....] - ETA: 29s - loss: 0.0871 - regression_loss: 0.0758 - classification_loss: 0.0113 443/500 [=========================>....] - ETA: 29s - loss: 0.0870 - regression_loss: 0.0758 - classification_loss: 0.0112 444/500 [=========================>....] - ETA: 28s - loss: 0.0870 - regression_loss: 0.0757 - classification_loss: 0.0112 445/500 [=========================>....] - ETA: 28s - loss: 0.0869 - regression_loss: 0.0756 - classification_loss: 0.0112 446/500 [=========================>....] - ETA: 27s - loss: 0.0867 - regression_loss: 0.0755 - classification_loss: 0.0112 447/500 [=========================>....] - ETA: 27s - loss: 0.0867 - regression_loss: 0.0755 - classification_loss: 0.0112 448/500 [=========================>....] - ETA: 26s - loss: 0.0866 - regression_loss: 0.0754 - classification_loss: 0.0112 449/500 [=========================>....] - ETA: 26s - loss: 0.0876 - regression_loss: 0.0763 - classification_loss: 0.0113 450/500 [==========================>...] - ETA: 25s - loss: 0.0874 - regression_loss: 0.0762 - classification_loss: 0.0113 451/500 [==========================>...] - ETA: 25s - loss: 0.0874 - regression_loss: 0.0761 - classification_loss: 0.0112 452/500 [==========================>...] - ETA: 24s - loss: 0.0875 - regression_loss: 0.0762 - classification_loss: 0.0112 453/500 [==========================>...] - ETA: 24s - loss: 0.0879 - regression_loss: 0.0766 - classification_loss: 0.0113 454/500 [==========================>...] - ETA: 23s - loss: 0.0879 - regression_loss: 0.0767 - classification_loss: 0.0113 455/500 [==========================>...] - ETA: 22s - loss: 0.0878 - regression_loss: 0.0766 - classification_loss: 0.0113 456/500 [==========================>...] - ETA: 22s - loss: 0.0877 - regression_loss: 0.0764 - classification_loss: 0.0112 457/500 [==========================>...] 
- ETA: 21s - loss: 0.0876 - regression_loss: 0.0763 - classification_loss: 0.0112 458/500 [==========================>...] - ETA: 21s - loss: 0.0876 - regression_loss: 0.0763 - classification_loss: 0.0112 459/500 [==========================>...] - ETA: 20s - loss: 0.0875 - regression_loss: 0.0763 - classification_loss: 0.0112 460/500 [==========================>...] - ETA: 20s - loss: 0.0874 - regression_loss: 0.0762 - classification_loss: 0.0112 461/500 [==========================>...] - ETA: 19s - loss: 0.0874 - regression_loss: 0.0761 - classification_loss: 0.0112 462/500 [==========================>...] - ETA: 19s - loss: 0.0873 - regression_loss: 0.0761 - classification_loss: 0.0112 463/500 [==========================>...] - ETA: 18s - loss: 0.0872 - regression_loss: 0.0760 - classification_loss: 0.0112 464/500 [==========================>...] - ETA: 18s - loss: 0.0871 - regression_loss: 0.0759 - classification_loss: 0.0112 465/500 [==========================>...] - ETA: 17s - loss: 0.0870 - regression_loss: 0.0758 - classification_loss: 0.0112 466/500 [==========================>...] - ETA: 17s - loss: 0.0869 - regression_loss: 0.0758 - classification_loss: 0.0112 467/500 [===========================>..] - ETA: 16s - loss: 0.0869 - regression_loss: 0.0757 - classification_loss: 0.0112 468/500 [===========================>..] - ETA: 16s - loss: 0.0870 - regression_loss: 0.0758 - classification_loss: 0.0111 469/500 [===========================>..] - ETA: 15s - loss: 0.0885 - regression_loss: 0.0769 - classification_loss: 0.0116 470/500 [===========================>..] - ETA: 15s - loss: 0.0884 - regression_loss: 0.0768 - classification_loss: 0.0116 471/500 [===========================>..] - ETA: 14s - loss: 0.0883 - regression_loss: 0.0767 - classification_loss: 0.0116 472/500 [===========================>..] - ETA: 14s - loss: 0.0883 - regression_loss: 0.0767 - classification_loss: 0.0116 473/500 [===========================>..] 
- ETA: 13s - loss: 0.0882 - regression_loss: 0.0766 - classification_loss: 0.0116 474/500 [===========================>..] - ETA: 13s - loss: 0.0881 - regression_loss: 0.0765 - classification_loss: 0.0116 475/500 [===========================>..] - ETA: 12s - loss: 0.0880 - regression_loss: 0.0764 - classification_loss: 0.0116 476/500 [===========================>..] - ETA: 12s - loss: 0.0880 - regression_loss: 0.0764 - classification_loss: 0.0116 477/500 [===========================>..] - ETA: 11s - loss: 0.0879 - regression_loss: 0.0763 - classification_loss: 0.0116 478/500 [===========================>..] - ETA: 11s - loss: 0.0881 - regression_loss: 0.0765 - classification_loss: 0.0115 479/500 [===========================>..] - ETA: 10s - loss: 0.0880 - regression_loss: 0.0765 - classification_loss: 0.0115 480/500 [===========================>..] - ETA: 10s - loss: 0.0880 - regression_loss: 0.0765 - classification_loss: 0.0115 481/500 [===========================>..] - ETA: 9s - loss: 0.0879 - regression_loss: 0.0764 - classification_loss: 0.0115  482/500 [===========================>..] - ETA: 9s - loss: 0.0879 - regression_loss: 0.0764 - classification_loss: 0.0115 483/500 [===========================>..] - ETA: 8s - loss: 0.0877 - regression_loss: 0.0763 - classification_loss: 0.0115 484/500 [============================>.] - ETA: 8s - loss: 0.0880 - regression_loss: 0.0765 - classification_loss: 0.0115 485/500 [============================>.] - ETA: 7s - loss: 0.0879 - regression_loss: 0.0764 - classification_loss: 0.0115 486/500 [============================>.] - ETA: 7s - loss: 0.0878 - regression_loss: 0.0763 - classification_loss: 0.0115 487/500 [============================>.] - ETA: 6s - loss: 0.0879 - regression_loss: 0.0764 - classification_loss: 0.0115 488/500 [============================>.] - ETA: 6s - loss: 0.0878 - regression_loss: 0.0763 - classification_loss: 0.0115 489/500 [============================>.] 
- ETA: 5s - loss: 0.0876 - regression_loss: 0.0762 - classification_loss: 0.0115 490/500 [============================>.] - ETA: 5s - loss: 0.0876 - regression_loss: 0.0761 - classification_loss: 0.0115 491/500 [============================>.] - ETA: 4s - loss: 0.0875 - regression_loss: 0.0761 - classification_loss: 0.0114 492/500 [============================>.] - ETA: 4s - loss: 0.0874 - regression_loss: 0.0760 - classification_loss: 0.0114 493/500 [============================>.] - ETA: 3s - loss: 0.0874 - regression_loss: 0.0760 - classification_loss: 0.0114 494/500 [============================>.] - ETA: 3s - loss: 0.0873 - regression_loss: 0.0759 - classification_loss: 0.0114 495/500 [============================>.] - ETA: 2s - loss: 0.0875 - regression_loss: 0.0760 - classification_loss: 0.0114 496/500 [============================>.] - ETA: 2s - loss: 0.0874 - regression_loss: 0.0760 - classification_loss: 0.0114 497/500 [============================>.] - ETA: 1s - loss: 0.0873 - regression_loss: 0.0759 - classification_loss: 0.0114 498/500 [============================>.] - ETA: 1s - loss: 0.0872 - regression_loss: 0.0758 - classification_loss: 0.0114 499/500 [============================>.] - ETA: 0s - loss: 0.0872 - regression_loss: 0.0758 - classification_loss: 0.0114 500/500 [==============================] - 256s 511ms/step - loss: 0.0871 - regression_loss: 0.0757 - classification_loss: 0.0114 326 instances of class plum with average precision: 0.8309 mAP: 0.8309 Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5 Epoch 21/150 1/500 [..............................] - ETA: 4:12 - loss: 0.0865 - regression_loss: 0.0800 - classification_loss: 0.0065 2/500 [..............................] - ETA: 4:14 - loss: 0.3723 - regression_loss: 0.3330 - classification_loss: 0.0394 3/500 [..............................] - ETA: 4:10 - loss: 0.2646 - regression_loss: 0.2372 - classification_loss: 0.0274 4/500 [..............................] 
- ETA: 3:47 - loss: 0.0861 - regression_loss: 0.0742 - classification_loss: 0.0119 53/500 [==>...........................] - ETA: 3:47 - loss: 0.0853 - regression_loss: 0.0735 - classification_loss: 0.0118 54/500 [==>...........................] - ETA: 3:46 - loss: 0.0843 - regression_loss: 0.0725 - classification_loss: 0.0118 55/500 [==>...........................] - ETA: 3:46 - loss: 0.0836 - regression_loss: 0.0719 - classification_loss: 0.0117 56/500 [==>...........................] - ETA: 3:45 - loss: 0.0837 - regression_loss: 0.0720 - classification_loss: 0.0117 57/500 [==>...........................] - ETA: 3:45 - loss: 0.0889 - regression_loss: 0.0772 - classification_loss: 0.0117 58/500 [==>...........................] - ETA: 3:45 - loss: 0.0913 - regression_loss: 0.0793 - classification_loss: 0.0120 59/500 [==>...........................] - ETA: 3:44 - loss: 0.0902 - regression_loss: 0.0783 - classification_loss: 0.0119 60/500 [==>...........................] - ETA: 3:44 - loss: 0.0894 - regression_loss: 0.0776 - classification_loss: 0.0118 61/500 [==>...........................] - ETA: 3:43 - loss: 0.0888 - regression_loss: 0.0771 - classification_loss: 0.0117 62/500 [==>...........................] - ETA: 3:43 - loss: 0.0883 - regression_loss: 0.0767 - classification_loss: 0.0116 63/500 [==>...........................] - ETA: 3:42 - loss: 0.0877 - regression_loss: 0.0762 - classification_loss: 0.0115 64/500 [==>...........................] - ETA: 3:42 - loss: 0.0868 - regression_loss: 0.0755 - classification_loss: 0.0114 65/500 [==>...........................] - ETA: 3:41 - loss: 0.0863 - regression_loss: 0.0750 - classification_loss: 0.0113 66/500 [==>...........................] - ETA: 3:41 - loss: 0.0856 - regression_loss: 0.0743 - classification_loss: 0.0113 67/500 [===>..........................] - ETA: 3:40 - loss: 0.0858 - regression_loss: 0.0744 - classification_loss: 0.0114 68/500 [===>..........................] 
- ETA: 3:40 - loss: 0.0849 - regression_loss: 0.0737 - classification_loss: 0.0113 69/500 [===>..........................] - ETA: 3:39 - loss: 0.0843 - regression_loss: 0.0731 - classification_loss: 0.0112 70/500 [===>..........................] - ETA: 3:39 - loss: 0.0835 - regression_loss: 0.0724 - classification_loss: 0.0111 71/500 [===>..........................] - ETA: 3:39 - loss: 0.0826 - regression_loss: 0.0716 - classification_loss: 0.0110 72/500 [===>..........................] - ETA: 3:38 - loss: 0.0822 - regression_loss: 0.0713 - classification_loss: 0.0109 73/500 [===>..........................] - ETA: 3:37 - loss: 0.0823 - regression_loss: 0.0714 - classification_loss: 0.0109 74/500 [===>..........................] - ETA: 3:37 - loss: 0.0820 - regression_loss: 0.0711 - classification_loss: 0.0109 75/500 [===>..........................] - ETA: 3:36 - loss: 0.0814 - regression_loss: 0.0706 - classification_loss: 0.0108 76/500 [===>..........................] - ETA: 3:36 - loss: 0.0808 - regression_loss: 0.0701 - classification_loss: 0.0108 77/500 [===>..........................] - ETA: 3:35 - loss: 0.0882 - regression_loss: 0.0766 - classification_loss: 0.0115 78/500 [===>..........................] - ETA: 3:35 - loss: 0.0875 - regression_loss: 0.0761 - classification_loss: 0.0114 79/500 [===>..........................] - ETA: 3:34 - loss: 0.0886 - regression_loss: 0.0771 - classification_loss: 0.0115 80/500 [===>..........................] - ETA: 3:34 - loss: 0.0885 - regression_loss: 0.0770 - classification_loss: 0.0115 81/500 [===>..........................] - ETA: 3:33 - loss: 0.0876 - regression_loss: 0.0762 - classification_loss: 0.0114 82/500 [===>..........................] - ETA: 3:33 - loss: 0.0870 - regression_loss: 0.0757 - classification_loss: 0.0113 83/500 [===>..........................] - ETA: 3:32 - loss: 0.0862 - regression_loss: 0.0750 - classification_loss: 0.0112 84/500 [====>.........................] 
- ETA: 3:32 - loss: 0.0855 - regression_loss: 0.0744 - classification_loss: 0.0111 85/500 [====>.........................] - ETA: 3:31 - loss: 0.0847 - regression_loss: 0.0737 - classification_loss: 0.0110 86/500 [====>.........................] - ETA: 3:31 - loss: 0.0844 - regression_loss: 0.0735 - classification_loss: 0.0110 87/500 [====>.........................] - ETA: 3:30 - loss: 0.0928 - regression_loss: 0.0791 - classification_loss: 0.0137 88/500 [====>.........................] - ETA: 3:30 - loss: 0.0923 - regression_loss: 0.0787 - classification_loss: 0.0136 89/500 [====>.........................] - ETA: 3:29 - loss: 0.0917 - regression_loss: 0.0782 - classification_loss: 0.0135 90/500 [====>.........................] - ETA: 3:29 - loss: 0.0917 - regression_loss: 0.0783 - classification_loss: 0.0134 91/500 [====>.........................] - ETA: 3:28 - loss: 0.0911 - regression_loss: 0.0778 - classification_loss: 0.0133 92/500 [====>.........................] - ETA: 3:28 - loss: 0.0907 - regression_loss: 0.0775 - classification_loss: 0.0132 93/500 [====>.........................] - ETA: 3:27 - loss: 0.0911 - regression_loss: 0.0780 - classification_loss: 0.0132 94/500 [====>.........................] - ETA: 3:27 - loss: 0.0907 - regression_loss: 0.0776 - classification_loss: 0.0131 95/500 [====>.........................] - ETA: 3:26 - loss: 0.0901 - regression_loss: 0.0771 - classification_loss: 0.0130 96/500 [====>.........................] - ETA: 3:26 - loss: 0.0896 - regression_loss: 0.0766 - classification_loss: 0.0130 97/500 [====>.........................] - ETA: 3:25 - loss: 0.0903 - regression_loss: 0.0773 - classification_loss: 0.0130 98/500 [====>.........................] - ETA: 3:25 - loss: 0.0899 - regression_loss: 0.0769 - classification_loss: 0.0130 99/500 [====>.........................] - ETA: 3:24 - loss: 0.0904 - regression_loss: 0.0775 - classification_loss: 0.0130 100/500 [=====>........................] 
- ETA: 3:24 - loss: 0.0908 - regression_loss: 0.0779 - classification_loss: 0.0129 101/500 [=====>........................] - ETA: 3:23 - loss: 0.0907 - regression_loss: 0.0779 - classification_loss: 0.0128 102/500 [=====>........................] - ETA: 3:23 - loss: 0.0906 - regression_loss: 0.0778 - classification_loss: 0.0128 103/500 [=====>........................] - ETA: 3:23 - loss: 0.0900 - regression_loss: 0.0773 - classification_loss: 0.0127 104/500 [=====>........................] - ETA: 3:22 - loss: 0.0897 - regression_loss: 0.0771 - classification_loss: 0.0126 105/500 [=====>........................] - ETA: 3:22 - loss: 0.0891 - regression_loss: 0.0766 - classification_loss: 0.0125 106/500 [=====>........................] - ETA: 3:21 - loss: 0.0894 - regression_loss: 0.0769 - classification_loss: 0.0125 107/500 [=====>........................] - ETA: 3:21 - loss: 0.0890 - regression_loss: 0.0765 - classification_loss: 0.0125 108/500 [=====>........................] - ETA: 3:20 - loss: 0.0885 - regression_loss: 0.0760 - classification_loss: 0.0124 109/500 [=====>........................] - ETA: 3:20 - loss: 0.0883 - regression_loss: 0.0759 - classification_loss: 0.0124 110/500 [=====>........................] - ETA: 3:19 - loss: 0.0881 - regression_loss: 0.0757 - classification_loss: 0.0124 111/500 [=====>........................] - ETA: 3:18 - loss: 0.0875 - regression_loss: 0.0752 - classification_loss: 0.0123 112/500 [=====>........................] - ETA: 3:18 - loss: 0.0873 - regression_loss: 0.0751 - classification_loss: 0.0122 113/500 [=====>........................] - ETA: 3:17 - loss: 0.0870 - regression_loss: 0.0748 - classification_loss: 0.0122 114/500 [=====>........................] - ETA: 3:17 - loss: 0.0917 - regression_loss: 0.0792 - classification_loss: 0.0125 115/500 [=====>........................] - ETA: 3:16 - loss: 0.0914 - regression_loss: 0.0789 - classification_loss: 0.0125 116/500 [=====>........................] 
- ETA: 3:16 - loss: 0.0907 - regression_loss: 0.0783 - classification_loss: 0.0124 117/500 [======>.......................] - ETA: 3:15 - loss: 0.0903 - regression_loss: 0.0780 - classification_loss: 0.0124 118/500 [======>.......................] - ETA: 3:15 - loss: 0.0899 - regression_loss: 0.0776 - classification_loss: 0.0123 119/500 [======>.......................] - ETA: 3:14 - loss: 0.0900 - regression_loss: 0.0778 - classification_loss: 0.0123 120/500 [======>.......................] - ETA: 3:14 - loss: 0.0896 - regression_loss: 0.0774 - classification_loss: 0.0122 121/500 [======>.......................] - ETA: 3:13 - loss: 0.0892 - regression_loss: 0.0771 - classification_loss: 0.0121 122/500 [======>.......................] - ETA: 3:13 - loss: 0.0894 - regression_loss: 0.0773 - classification_loss: 0.0121 123/500 [======>.......................] - ETA: 3:12 - loss: 0.0892 - regression_loss: 0.0771 - classification_loss: 0.0121 124/500 [======>.......................] - ETA: 3:12 - loss: 0.0887 - regression_loss: 0.0767 - classification_loss: 0.0120 125/500 [======>.......................] - ETA: 3:11 - loss: 0.0888 - regression_loss: 0.0767 - classification_loss: 0.0121 126/500 [======>.......................] - ETA: 3:11 - loss: 0.0885 - regression_loss: 0.0765 - classification_loss: 0.0120 127/500 [======>.......................] - ETA: 3:10 - loss: 0.0882 - regression_loss: 0.0762 - classification_loss: 0.0119 128/500 [======>.......................] - ETA: 3:10 - loss: 0.0878 - regression_loss: 0.0759 - classification_loss: 0.0119 129/500 [======>.......................] - ETA: 3:09 - loss: 0.0875 - regression_loss: 0.0757 - classification_loss: 0.0118 130/500 [======>.......................] - ETA: 3:09 - loss: 0.0873 - regression_loss: 0.0755 - classification_loss: 0.0118 131/500 [======>.......................] - ETA: 3:08 - loss: 0.0869 - regression_loss: 0.0752 - classification_loss: 0.0117 132/500 [======>.......................] 
- ETA: 3:08 - loss: 0.0868 - regression_loss: 0.0751 - classification_loss: 0.0117 133/500 [======>.......................] - ETA: 3:07 - loss: 0.0864 - regression_loss: 0.0748 - classification_loss: 0.0117 134/500 [=======>......................] - ETA: 3:07 - loss: 0.0860 - regression_loss: 0.0744 - classification_loss: 0.0116 135/500 [=======>......................] - ETA: 3:06 - loss: 0.0859 - regression_loss: 0.0743 - classification_loss: 0.0116 136/500 [=======>......................] - ETA: 3:06 - loss: 0.0868 - regression_loss: 0.0752 - classification_loss: 0.0116 137/500 [=======>......................] - ETA: 3:05 - loss: 0.0864 - regression_loss: 0.0748 - classification_loss: 0.0116 138/500 [=======>......................] - ETA: 3:05 - loss: 0.0860 - regression_loss: 0.0745 - classification_loss: 0.0116 139/500 [=======>......................] - ETA: 3:04 - loss: 0.0857 - regression_loss: 0.0742 - classification_loss: 0.0115 140/500 [=======>......................] - ETA: 3:04 - loss: 0.0854 - regression_loss: 0.0739 - classification_loss: 0.0115 141/500 [=======>......................] - ETA: 3:03 - loss: 0.0861 - regression_loss: 0.0744 - classification_loss: 0.0117 142/500 [=======>......................] - ETA: 3:03 - loss: 0.0856 - regression_loss: 0.0740 - classification_loss: 0.0117 143/500 [=======>......................] - ETA: 3:02 - loss: 0.0859 - regression_loss: 0.0742 - classification_loss: 0.0117 144/500 [=======>......................] - ETA: 3:02 - loss: 0.0857 - regression_loss: 0.0740 - classification_loss: 0.0116 145/500 [=======>......................] - ETA: 3:01 - loss: 0.0855 - regression_loss: 0.0739 - classification_loss: 0.0116 146/500 [=======>......................] - ETA: 3:01 - loss: 0.0851 - regression_loss: 0.0735 - classification_loss: 0.0116 147/500 [=======>......................] - ETA: 3:00 - loss: 0.0859 - regression_loss: 0.0743 - classification_loss: 0.0116 148/500 [=======>......................] 
- ETA: 3:00 - loss: 0.0871 - regression_loss: 0.0754 - classification_loss: 0.0118 149/500 [=======>......................] - ETA: 2:59 - loss: 0.0871 - regression_loss: 0.0753 - classification_loss: 0.0118 150/500 [========>.....................] - ETA: 2:59 - loss: 0.0867 - regression_loss: 0.0749 - classification_loss: 0.0117 151/500 [========>.....................] - ETA: 2:58 - loss: 0.0863 - regression_loss: 0.0746 - classification_loss: 0.0117 152/500 [========>.....................] - ETA: 2:58 - loss: 0.0867 - regression_loss: 0.0750 - classification_loss: 0.0118 153/500 [========>.....................] - ETA: 2:57 - loss: 0.0867 - regression_loss: 0.0749 - classification_loss: 0.0118 154/500 [========>.....................] - ETA: 2:57 - loss: 0.0863 - regression_loss: 0.0745 - classification_loss: 0.0118 155/500 [========>.....................] - ETA: 2:56 - loss: 0.0862 - regression_loss: 0.0745 - classification_loss: 0.0117 156/500 [========>.....................] - ETA: 2:56 - loss: 0.0859 - regression_loss: 0.0742 - classification_loss: 0.0117 157/500 [========>.....................] - ETA: 2:55 - loss: 0.0856 - regression_loss: 0.0740 - classification_loss: 0.0116 158/500 [========>.....................] - ETA: 2:54 - loss: 0.0854 - regression_loss: 0.0737 - classification_loss: 0.0116 159/500 [========>.....................] - ETA: 2:54 - loss: 0.0854 - regression_loss: 0.0738 - classification_loss: 0.0116 160/500 [========>.....................] - ETA: 2:53 - loss: 0.0852 - regression_loss: 0.0736 - classification_loss: 0.0116 161/500 [========>.....................] - ETA: 2:53 - loss: 0.0851 - regression_loss: 0.0735 - classification_loss: 0.0116 162/500 [========>.....................] - ETA: 2:52 - loss: 0.0848 - regression_loss: 0.0733 - classification_loss: 0.0115 163/500 [========>.....................] - ETA: 2:52 - loss: 0.0847 - regression_loss: 0.0732 - classification_loss: 0.0115 164/500 [========>.....................] 
- ETA: 2:51 - loss: 0.0853 - regression_loss: 0.0737 - classification_loss: 0.0117 165/500 [========>.....................] - ETA: 2:51 - loss: 0.0852 - regression_loss: 0.0736 - classification_loss: 0.0116 166/500 [========>.....................] - ETA: 2:50 - loss: 0.0850 - regression_loss: 0.0734 - classification_loss: 0.0116 167/500 [=========>....................] - ETA: 2:50 - loss: 0.0855 - regression_loss: 0.0739 - classification_loss: 0.0116 168/500 [=========>....................] - ETA: 2:49 - loss: 0.0853 - regression_loss: 0.0737 - classification_loss: 0.0116 169/500 [=========>....................] - ETA: 2:49 - loss: 0.0851 - regression_loss: 0.0735 - classification_loss: 0.0116 170/500 [=========>....................] - ETA: 2:48 - loss: 0.0848 - regression_loss: 0.0733 - classification_loss: 0.0115 171/500 [=========>....................] - ETA: 2:48 - loss: 0.0845 - regression_loss: 0.0731 - classification_loss: 0.0115 172/500 [=========>....................] - ETA: 2:47 - loss: 0.0844 - regression_loss: 0.0729 - classification_loss: 0.0115 173/500 [=========>....................] - ETA: 2:47 - loss: 0.0841 - regression_loss: 0.0726 - classification_loss: 0.0114 174/500 [=========>....................] - ETA: 2:46 - loss: 0.0838 - regression_loss: 0.0724 - classification_loss: 0.0114 175/500 [=========>....................] - ETA: 2:46 - loss: 0.0848 - regression_loss: 0.0734 - classification_loss: 0.0114 176/500 [=========>....................] - ETA: 2:45 - loss: 0.0871 - regression_loss: 0.0752 - classification_loss: 0.0119 177/500 [=========>....................] - ETA: 2:45 - loss: 0.0868 - regression_loss: 0.0750 - classification_loss: 0.0119 178/500 [=========>....................] - ETA: 2:44 - loss: 0.0866 - regression_loss: 0.0748 - classification_loss: 0.0118 179/500 [=========>....................] - ETA: 2:44 - loss: 0.0865 - regression_loss: 0.0747 - classification_loss: 0.0118 180/500 [=========>....................] 
- ETA: 2:43 - loss: 0.0862 - regression_loss: 0.0744 - classification_loss: 0.0118 181/500 [=========>....................] - ETA: 2:43 - loss: 0.0861 - regression_loss: 0.0744 - classification_loss: 0.0117 182/500 [=========>....................] - ETA: 2:42 - loss: 0.0858 - regression_loss: 0.0741 - classification_loss: 0.0117 183/500 [=========>....................] - ETA: 2:42 - loss: 0.0858 - regression_loss: 0.0741 - classification_loss: 0.0117 184/500 [==========>...................] - ETA: 2:41 - loss: 0.0855 - regression_loss: 0.0739 - classification_loss: 0.0116 185/500 [==========>...................] - ETA: 2:41 - loss: 0.0856 - regression_loss: 0.0740 - classification_loss: 0.0116 186/500 [==========>...................] - ETA: 2:40 - loss: 0.0856 - regression_loss: 0.0740 - classification_loss: 0.0116 187/500 [==========>...................] - ETA: 2:40 - loss: 0.0857 - regression_loss: 0.0741 - classification_loss: 0.0116 188/500 [==========>...................] - ETA: 2:39 - loss: 0.0854 - regression_loss: 0.0738 - classification_loss: 0.0116 189/500 [==========>...................] - ETA: 2:39 - loss: 0.0851 - regression_loss: 0.0736 - classification_loss: 0.0115 190/500 [==========>...................] - ETA: 2:38 - loss: 0.0850 - regression_loss: 0.0735 - classification_loss: 0.0115 191/500 [==========>...................] - ETA: 2:37 - loss: 0.0848 - regression_loss: 0.0733 - classification_loss: 0.0115 192/500 [==========>...................] - ETA: 2:37 - loss: 0.0845 - regression_loss: 0.0730 - classification_loss: 0.0114 193/500 [==========>...................] - ETA: 2:36 - loss: 0.0841 - regression_loss: 0.0727 - classification_loss: 0.0114 194/500 [==========>...................] - ETA: 2:36 - loss: 0.0839 - regression_loss: 0.0725 - classification_loss: 0.0114 195/500 [==========>...................] - ETA: 2:36 - loss: 0.0837 - regression_loss: 0.0724 - classification_loss: 0.0114 196/500 [==========>...................] 
- ETA: 2:35 - loss: 0.0839 - regression_loss: 0.0725 - classification_loss: 0.0114 197/500 [==========>...................] - ETA: 2:35 - loss: 0.0838 - regression_loss: 0.0725 - classification_loss: 0.0113 198/500 [==========>...................] - ETA: 2:34 - loss: 0.0840 - regression_loss: 0.0727 - classification_loss: 0.0114 199/500 [==========>...................] - ETA: 2:34 - loss: 0.0838 - regression_loss: 0.0725 - classification_loss: 0.0113 200/500 [===========>..................] - ETA: 2:33 - loss: 0.0836 - regression_loss: 0.0723 - classification_loss: 0.0113 201/500 [===========>..................] - ETA: 2:32 - loss: 0.0836 - regression_loss: 0.0723 - classification_loss: 0.0113 202/500 [===========>..................] - ETA: 2:32 - loss: 0.0833 - regression_loss: 0.0721 - classification_loss: 0.0112 203/500 [===========>..................] - ETA: 2:31 - loss: 0.0835 - regression_loss: 0.0722 - classification_loss: 0.0112 204/500 [===========>..................] - ETA: 2:31 - loss: 0.0853 - regression_loss: 0.0740 - classification_loss: 0.0113 205/500 [===========>..................] - ETA: 2:30 - loss: 0.0853 - regression_loss: 0.0740 - classification_loss: 0.0113 206/500 [===========>..................] - ETA: 2:30 - loss: 0.0851 - regression_loss: 0.0738 - classification_loss: 0.0113 207/500 [===========>..................] - ETA: 2:29 - loss: 0.0849 - regression_loss: 0.0736 - classification_loss: 0.0112 208/500 [===========>..................] - ETA: 2:29 - loss: 0.0848 - regression_loss: 0.0736 - classification_loss: 0.0112 209/500 [===========>..................] - ETA: 2:28 - loss: 0.0849 - regression_loss: 0.0737 - classification_loss: 0.0112 210/500 [===========>..................] - ETA: 2:28 - loss: 0.0846 - regression_loss: 0.0734 - classification_loss: 0.0112 211/500 [===========>..................] - ETA: 2:27 - loss: 0.0843 - regression_loss: 0.0732 - classification_loss: 0.0111 212/500 [===========>..................] 
- ETA: 2:27 - loss: 0.0843 - regression_loss: 0.0732 - classification_loss: 0.0111 213/500 [===========>..................] - ETA: 2:26 - loss: 0.0840 - regression_loss: 0.0729 - classification_loss: 0.0111 214/500 [===========>..................] - ETA: 2:26 - loss: 0.0837 - regression_loss: 0.0726 - classification_loss: 0.0110 215/500 [===========>..................] - ETA: 2:25 - loss: 0.0836 - regression_loss: 0.0725 - classification_loss: 0.0110 216/500 [===========>..................] - ETA: 2:25 - loss: 0.0835 - regression_loss: 0.0725 - classification_loss: 0.0110 217/500 [============>.................] - ETA: 2:24 - loss: 0.0834 - regression_loss: 0.0723 - classification_loss: 0.0110 218/500 [============>.................] - ETA: 2:24 - loss: 0.0834 - regression_loss: 0.0724 - classification_loss: 0.0110 219/500 [============>.................] - ETA: 2:23 - loss: 0.0833 - regression_loss: 0.0723 - classification_loss: 0.0110 220/500 [============>.................] - ETA: 2:23 - loss: 0.0831 - regression_loss: 0.0721 - classification_loss: 0.0110 221/500 [============>.................] - ETA: 2:22 - loss: 0.0831 - regression_loss: 0.0721 - classification_loss: 0.0110 222/500 [============>.................] - ETA: 2:22 - loss: 0.0828 - regression_loss: 0.0719 - classification_loss: 0.0109 223/500 [============>.................] - ETA: 2:21 - loss: 0.0829 - regression_loss: 0.0720 - classification_loss: 0.0109 224/500 [============>.................] - ETA: 2:21 - loss: 0.0827 - regression_loss: 0.0718 - classification_loss: 0.0109 225/500 [============>.................] - ETA: 2:20 - loss: 0.0827 - regression_loss: 0.0718 - classification_loss: 0.0109 226/500 [============>.................] - ETA: 2:20 - loss: 0.0825 - regression_loss: 0.0717 - classification_loss: 0.0108 227/500 [============>.................] - ETA: 2:19 - loss: 0.0823 - regression_loss: 0.0714 - classification_loss: 0.0108 228/500 [============>.................] 
- ETA: 2:19 - loss: 0.0820 - regression_loss: 0.0713 - classification_loss: 0.0108 229/500 [============>.................] - ETA: 2:18 - loss: 0.0818 - regression_loss: 0.0711 - classification_loss: 0.0107 230/500 [============>.................] - ETA: 2:18 - loss: 0.0816 - regression_loss: 0.0709 - classification_loss: 0.0107 231/500 [============>.................] - ETA: 2:17 - loss: 0.0816 - regression_loss: 0.0708 - classification_loss: 0.0107 232/500 [============>.................] - ETA: 2:17 - loss: 0.0814 - regression_loss: 0.0707 - classification_loss: 0.0107 233/500 [============>.................] - ETA: 2:16 - loss: 0.0812 - regression_loss: 0.0705 - classification_loss: 0.0107 234/500 [=============>................] - ETA: 2:16 - loss: 0.0812 - regression_loss: 0.0705 - classification_loss: 0.0107 235/500 [=============>................] - ETA: 2:15 - loss: 0.0811 - regression_loss: 0.0705 - classification_loss: 0.0107 236/500 [=============>................] - ETA: 2:15 - loss: 0.0811 - regression_loss: 0.0704 - classification_loss: 0.0107 237/500 [=============>................] - ETA: 2:14 - loss: 0.0808 - regression_loss: 0.0702 - classification_loss: 0.0106 238/500 [=============>................] - ETA: 2:14 - loss: 0.0806 - regression_loss: 0.0700 - classification_loss: 0.0106 239/500 [=============>................] - ETA: 2:13 - loss: 0.0807 - regression_loss: 0.0701 - classification_loss: 0.0106 240/500 [=============>................] - ETA: 2:13 - loss: 0.0822 - regression_loss: 0.0715 - classification_loss: 0.0107 241/500 [=============>................] - ETA: 2:12 - loss: 0.0822 - regression_loss: 0.0715 - classification_loss: 0.0107 242/500 [=============>................] - ETA: 2:12 - loss: 0.0821 - regression_loss: 0.0714 - classification_loss: 0.0106 243/500 [=============>................] - ETA: 2:11 - loss: 0.0824 - regression_loss: 0.0717 - classification_loss: 0.0107 244/500 [=============>................] 
- ETA: 2:10 - loss: 0.0822 - regression_loss: 0.0715 - classification_loss: 0.0107 245/500 [=============>................] - ETA: 2:10 - loss: 0.0820 - regression_loss: 0.0714 - classification_loss: 0.0106 246/500 [=============>................] - ETA: 2:09 - loss: 0.0819 - regression_loss: 0.0713 - classification_loss: 0.0106 247/500 [=============>................] - ETA: 2:09 - loss: 0.0817 - regression_loss: 0.0711 - classification_loss: 0.0106 248/500 [=============>................] - ETA: 2:08 - loss: 0.0816 - regression_loss: 0.0710 - classification_loss: 0.0106 249/500 [=============>................] - ETA: 2:08 - loss: 0.0816 - regression_loss: 0.0711 - classification_loss: 0.0106 250/500 [==============>...............] - ETA: 2:07 - loss: 0.0816 - regression_loss: 0.0710 - classification_loss: 0.0106 251/500 [==============>...............] - ETA: 2:07 - loss: 0.0828 - regression_loss: 0.0722 - classification_loss: 0.0106 252/500 [==============>...............] - ETA: 2:06 - loss: 0.0832 - regression_loss: 0.0725 - classification_loss: 0.0107 253/500 [==============>...............] - ETA: 2:06 - loss: 0.0830 - regression_loss: 0.0723 - classification_loss: 0.0107 254/500 [==============>...............] - ETA: 2:05 - loss: 0.0831 - regression_loss: 0.0724 - classification_loss: 0.0107 255/500 [==============>...............] - ETA: 2:05 - loss: 0.0832 - regression_loss: 0.0725 - classification_loss: 0.0107 256/500 [==============>...............] - ETA: 2:04 - loss: 0.0831 - regression_loss: 0.0724 - classification_loss: 0.0107 257/500 [==============>...............] - ETA: 2:04 - loss: 0.0829 - regression_loss: 0.0722 - classification_loss: 0.0107 258/500 [==============>...............] - ETA: 2:03 - loss: 0.0830 - regression_loss: 0.0724 - classification_loss: 0.0107 259/500 [==============>...............] - ETA: 2:03 - loss: 0.0830 - regression_loss: 0.0724 - classification_loss: 0.0107 260/500 [==============>...............] 
[per-step progress-bar updates for steps 261–499 truncated; loss fluctuated between ~0.079 and ~0.086 over the remainder of the epoch]
500/500 [==============================] - 256s 512ms/step - loss: 0.0841 - regression_loss: 0.0733 - classification_loss: 0.0107
326 instances of class plum with average precision: 0.8301
mAP: 0.8301
Epoch 00021: saving model to ./training/snapshots/resnet101_pascal_21.h5
ALL DONE :-)