CHECK: Is CUDA the right version (10)? weights is None Caution! Loading imagenet weights Creating model, this may take a second... Loading weights into model tracking anchors tracking anchors tracking anchors tracking anchors tracking anchors Model: "retinanet" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) (None, None, None, 3 0 __________________________________________________________________________________________________ padding_conv1 (ZeroPadding2D) (None, None, None, 3 0 input_1[0][0] __________________________________________________________________________________________________ conv1 (Conv2D) (None, None, None, 6 9408 padding_conv1[0][0] __________________________________________________________________________________________________ bn_conv1 (BatchNormalization) (None, None, None, 6 256 conv1[0][0] __________________________________________________________________________________________________ conv1_relu (Activation) (None, None, None, 6 0 bn_conv1[0][0] __________________________________________________________________________________________________ pool1 (MaxPooling2D) (None, None, None, 6 0 conv1_relu[0][0] __________________________________________________________________________________________________ res2a_branch2a (Conv2D) (None, None, None, 6 4096 pool1[0][0] __________________________________________________________________________________________________ bn2a_branch2a (BatchNormalizati (None, None, None, 6 256 res2a_branch2a[0][0] __________________________________________________________________________________________________ res2a_branch2a_relu (Activation (None, None, None, 6 0 bn2a_branch2a[0][0] __________________________________________________________________________________________________ padding2a_branch2b 
(ZeroPadding (None, None, None, 6 0 res2a_branch2a_relu[0][0] __________________________________________________________________________________________________ res2a_branch2b (Conv2D) (None, None, None, 6 36864 padding2a_branch2b[0][0] __________________________________________________________________________________________________ bn2a_branch2b (BatchNormalizati (None, None, None, 6 256 res2a_branch2b[0][0] __________________________________________________________________________________________________ res2a_branch2b_relu (Activation (None, None, None, 6 0 bn2a_branch2b[0][0] __________________________________________________________________________________________________ res2a_branch2c (Conv2D) (None, None, None, 2 16384 res2a_branch2b_relu[0][0] __________________________________________________________________________________________________ res2a_branch1 (Conv2D) (None, None, None, 2 16384 pool1[0][0] __________________________________________________________________________________________________ bn2a_branch2c (BatchNormalizati (None, None, None, 2 1024 res2a_branch2c[0][0] __________________________________________________________________________________________________ bn2a_branch1 (BatchNormalizatio (None, None, None, 2 1024 res2a_branch1[0][0] __________________________________________________________________________________________________ res2a (Add) (None, None, None, 2 0 bn2a_branch2c[0][0] bn2a_branch1[0][0] __________________________________________________________________________________________________ res2a_relu (Activation) (None, None, None, 2 0 res2a[0][0] __________________________________________________________________________________________________ res2b_branch2a (Conv2D) (None, None, None, 6 16384 res2a_relu[0][0] __________________________________________________________________________________________________ bn2b_branch2a (BatchNormalizati (None, None, None, 6 256 res2b_branch2a[0][0] 
__________________________________________________________________________________________________ res2b_branch2a_relu (Activation (None, None, None, 6 0 bn2b_branch2a[0][0] __________________________________________________________________________________________________ padding2b_branch2b (ZeroPadding (None, None, None, 6 0 res2b_branch2a_relu[0][0] __________________________________________________________________________________________________ res2b_branch2b (Conv2D) (None, None, None, 6 36864 padding2b_branch2b[0][0] __________________________________________________________________________________________________ bn2b_branch2b (BatchNormalizati (None, None, None, 6 256 res2b_branch2b[0][0] __________________________________________________________________________________________________ res2b_branch2b_relu (Activation (None, None, None, 6 0 bn2b_branch2b[0][0] __________________________________________________________________________________________________ res2b_branch2c (Conv2D) (None, None, None, 2 16384 res2b_branch2b_relu[0][0] __________________________________________________________________________________________________ bn2b_branch2c (BatchNormalizati (None, None, None, 2 1024 res2b_branch2c[0][0] __________________________________________________________________________________________________ res2b (Add) (None, None, None, 2 0 bn2b_branch2c[0][0] res2a_relu[0][0] __________________________________________________________________________________________________ res2b_relu (Activation) (None, None, None, 2 0 res2b[0][0] __________________________________________________________________________________________________ res2c_branch2a (Conv2D) (None, None, None, 6 16384 res2b_relu[0][0] __________________________________________________________________________________________________ bn2c_branch2a (BatchNormalizati (None, None, None, 6 256 res2c_branch2a[0][0] 
__________________________________________________________________________________________________ res2c_branch2a_relu (Activation (None, None, None, 6 0 bn2c_branch2a[0][0] __________________________________________________________________________________________________ padding2c_branch2b (ZeroPadding (None, None, None, 6 0 res2c_branch2a_relu[0][0] __________________________________________________________________________________________________ res2c_branch2b (Conv2D) (None, None, None, 6 36864 padding2c_branch2b[0][0] __________________________________________________________________________________________________ bn2c_branch2b (BatchNormalizati (None, None, None, 6 256 res2c_branch2b[0][0] __________________________________________________________________________________________________ res2c_branch2b_relu (Activation (None, None, None, 6 0 bn2c_branch2b[0][0] __________________________________________________________________________________________________ res2c_branch2c (Conv2D) (None, None, None, 2 16384 res2c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn2c_branch2c (BatchNormalizati (None, None, None, 2 1024 res2c_branch2c[0][0] __________________________________________________________________________________________________ res2c (Add) (None, None, None, 2 0 bn2c_branch2c[0][0] res2b_relu[0][0] __________________________________________________________________________________________________ res2c_relu (Activation) (None, None, None, 2 0 res2c[0][0] __________________________________________________________________________________________________ res3a_branch2a (Conv2D) (None, None, None, 1 32768 res2c_relu[0][0] __________________________________________________________________________________________________ bn3a_branch2a (BatchNormalizati (None, None, None, 1 512 res3a_branch2a[0][0] 
__________________________________________________________________________________________________ res3a_branch2a_relu (Activation (None, None, None, 1 0 bn3a_branch2a[0][0] __________________________________________________________________________________________________ padding3a_branch2b (ZeroPadding (None, None, None, 1 0 res3a_branch2a_relu[0][0] __________________________________________________________________________________________________ res3a_branch2b (Conv2D) (None, None, None, 1 147456 padding3a_branch2b[0][0] __________________________________________________________________________________________________ bn3a_branch2b (BatchNormalizati (None, None, None, 1 512 res3a_branch2b[0][0] __________________________________________________________________________________________________ res3a_branch2b_relu (Activation (None, None, None, 1 0 bn3a_branch2b[0][0] __________________________________________________________________________________________________ res3a_branch2c (Conv2D) (None, None, None, 5 65536 res3a_branch2b_relu[0][0] __________________________________________________________________________________________________ res3a_branch1 (Conv2D) (None, None, None, 5 131072 res2c_relu[0][0] __________________________________________________________________________________________________ bn3a_branch2c (BatchNormalizati (None, None, None, 5 2048 res3a_branch2c[0][0] __________________________________________________________________________________________________ bn3a_branch1 (BatchNormalizatio (None, None, None, 5 2048 res3a_branch1[0][0] __________________________________________________________________________________________________ res3a (Add) (None, None, None, 5 0 bn3a_branch2c[0][0] bn3a_branch1[0][0] __________________________________________________________________________________________________ res3a_relu (Activation) (None, None, None, 5 0 res3a[0][0] 
__________________________________________________________________________________________________ res3b_branch2a (Conv2D) (None, None, None, 1 65536 res3a_relu[0][0] __________________________________________________________________________________________________ bn3b_branch2a (BatchNormalizati (None, None, None, 1 512 res3b_branch2a[0][0] __________________________________________________________________________________________________ res3b_branch2a_relu (Activation (None, None, None, 1 0 bn3b_branch2a[0][0] __________________________________________________________________________________________________ padding3b_branch2b (ZeroPadding (None, None, None, 1 0 res3b_branch2a_relu[0][0] __________________________________________________________________________________________________ res3b_branch2b (Conv2D) (None, None, None, 1 147456 padding3b_branch2b[0][0] __________________________________________________________________________________________________ bn3b_branch2b (BatchNormalizati (None, None, None, 1 512 res3b_branch2b[0][0] __________________________________________________________________________________________________ res3b_branch2b_relu (Activation (None, None, None, 1 0 bn3b_branch2b[0][0] __________________________________________________________________________________________________ res3b_branch2c (Conv2D) (None, None, None, 5 65536 res3b_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3b_branch2c (BatchNormalizati (None, None, None, 5 2048 res3b_branch2c[0][0] __________________________________________________________________________________________________ res3b (Add) (None, None, None, 5 0 bn3b_branch2c[0][0] res3a_relu[0][0] __________________________________________________________________________________________________ res3b_relu (Activation) (None, None, None, 5 0 res3b[0][0] 
__________________________________________________________________________________________________ res3c_branch2a (Conv2D) (None, None, None, 1 65536 res3b_relu[0][0] __________________________________________________________________________________________________ bn3c_branch2a (BatchNormalizati (None, None, None, 1 512 res3c_branch2a[0][0] __________________________________________________________________________________________________ res3c_branch2a_relu (Activation (None, None, None, 1 0 bn3c_branch2a[0][0] __________________________________________________________________________________________________ padding3c_branch2b (ZeroPadding (None, None, None, 1 0 res3c_branch2a_relu[0][0] __________________________________________________________________________________________________ res3c_branch2b (Conv2D) (None, None, None, 1 147456 padding3c_branch2b[0][0] __________________________________________________________________________________________________ bn3c_branch2b (BatchNormalizati (None, None, None, 1 512 res3c_branch2b[0][0] __________________________________________________________________________________________________ res3c_branch2b_relu (Activation (None, None, None, 1 0 bn3c_branch2b[0][0] __________________________________________________________________________________________________ res3c_branch2c (Conv2D) (None, None, None, 5 65536 res3c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3c_branch2c (BatchNormalizati (None, None, None, 5 2048 res3c_branch2c[0][0] __________________________________________________________________________________________________ res3c (Add) (None, None, None, 5 0 bn3c_branch2c[0][0] res3b_relu[0][0] __________________________________________________________________________________________________ res3c_relu (Activation) (None, None, None, 5 0 res3c[0][0] 
__________________________________________________________________________________________________ res3d_branch2a (Conv2D) (None, None, None, 1 65536 res3c_relu[0][0] __________________________________________________________________________________________________ bn3d_branch2a (BatchNormalizati (None, None, None, 1 512 res3d_branch2a[0][0] __________________________________________________________________________________________________ res3d_branch2a_relu (Activation (None, None, None, 1 0 bn3d_branch2a[0][0] __________________________________________________________________________________________________ padding3d_branch2b (ZeroPadding (None, None, None, 1 0 res3d_branch2a_relu[0][0] __________________________________________________________________________________________________ res3d_branch2b (Conv2D) (None, None, None, 1 147456 padding3d_branch2b[0][0] __________________________________________________________________________________________________ bn3d_branch2b (BatchNormalizati (None, None, None, 1 512 res3d_branch2b[0][0] __________________________________________________________________________________________________ res3d_branch2b_relu (Activation (None, None, None, 1 0 bn3d_branch2b[0][0] __________________________________________________________________________________________________ res3d_branch2c (Conv2D) (None, None, None, 5 65536 res3d_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3d_branch2c (BatchNormalizati (None, None, None, 5 2048 res3d_branch2c[0][0] __________________________________________________________________________________________________ res3d (Add) (None, None, None, 5 0 bn3d_branch2c[0][0] res3c_relu[0][0] __________________________________________________________________________________________________ res3d_relu (Activation) (None, None, None, 5 0 res3d[0][0] 
__________________________________________________________________________________________________ res4a_branch2a (Conv2D) (None, None, None, 2 131072 res3d_relu[0][0] __________________________________________________________________________________________________ bn4a_branch2a (BatchNormalizati (None, None, None, 2 1024 res4a_branch2a[0][0] __________________________________________________________________________________________________ res4a_branch2a_relu (Activation (None, None, None, 2 0 bn4a_branch2a[0][0] __________________________________________________________________________________________________ padding4a_branch2b (ZeroPadding (None, None, None, 2 0 res4a_branch2a_relu[0][0] __________________________________________________________________________________________________ res4a_branch2b (Conv2D) (None, None, None, 2 589824 padding4a_branch2b[0][0] __________________________________________________________________________________________________ bn4a_branch2b (BatchNormalizati (None, None, None, 2 1024 res4a_branch2b[0][0] __________________________________________________________________________________________________ res4a_branch2b_relu (Activation (None, None, None, 2 0 bn4a_branch2b[0][0] __________________________________________________________________________________________________ res4a_branch2c (Conv2D) (None, None, None, 1 262144 res4a_branch2b_relu[0][0] __________________________________________________________________________________________________ res4a_branch1 (Conv2D) (None, None, None, 1 524288 res3d_relu[0][0] __________________________________________________________________________________________________ bn4a_branch2c (BatchNormalizati (None, None, None, 1 4096 res4a_branch2c[0][0] __________________________________________________________________________________________________ bn4a_branch1 (BatchNormalizatio (None, None, None, 1 4096 res4a_branch1[0][0] 
__________________________________________________________________________________________________ res4a (Add) (None, None, None, 1 0 bn4a_branch2c[0][0] bn4a_branch1[0][0] __________________________________________________________________________________________________ res4a_relu (Activation) (None, None, None, 1 0 res4a[0][0] __________________________________________________________________________________________________ res4b_branch2a (Conv2D) (None, None, None, 2 262144 res4a_relu[0][0] __________________________________________________________________________________________________ bn4b_branch2a (BatchNormalizati (None, None, None, 2 1024 res4b_branch2a[0][0] __________________________________________________________________________________________________ res4b_branch2a_relu (Activation (None, None, None, 2 0 bn4b_branch2a[0][0] __________________________________________________________________________________________________ padding4b_branch2b (ZeroPadding (None, None, None, 2 0 res4b_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b_branch2b (Conv2D) (None, None, None, 2 589824 padding4b_branch2b[0][0] __________________________________________________________________________________________________ bn4b_branch2b (BatchNormalizati (None, None, None, 2 1024 res4b_branch2b[0][0] __________________________________________________________________________________________________ res4b_branch2b_relu (Activation (None, None, None, 2 0 bn4b_branch2b[0][0] __________________________________________________________________________________________________ res4b_branch2c (Conv2D) (None, None, None, 1 262144 res4b_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b_branch2c (BatchNormalizati (None, None, None, 1 4096 res4b_branch2c[0][0] 
__________________________________________________________________________________________________ res4b (Add) (None, None, None, 1 0 bn4b_branch2c[0][0] res4a_relu[0][0] __________________________________________________________________________________________________ res4b_relu (Activation) (None, None, None, 1 0 res4b[0][0] __________________________________________________________________________________________________ res4c_branch2a (Conv2D) (None, None, None, 2 262144 res4b_relu[0][0] __________________________________________________________________________________________________ bn4c_branch2a (BatchNormalizati (None, None, None, 2 1024 res4c_branch2a[0][0] __________________________________________________________________________________________________ res4c_branch2a_relu (Activation (None, None, None, 2 0 bn4c_branch2a[0][0] __________________________________________________________________________________________________ padding4c_branch2b (ZeroPadding (None, None, None, 2 0 res4c_branch2a_relu[0][0] __________________________________________________________________________________________________ res4c_branch2b (Conv2D) (None, None, None, 2 589824 padding4c_branch2b[0][0] __________________________________________________________________________________________________ bn4c_branch2b (BatchNormalizati (None, None, None, 2 1024 res4c_branch2b[0][0] __________________________________________________________________________________________________ res4c_branch2b_relu (Activation (None, None, None, 2 0 bn4c_branch2b[0][0] __________________________________________________________________________________________________ res4c_branch2c (Conv2D) (None, None, None, 1 262144 res4c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4c_branch2c (BatchNormalizati (None, None, None, 1 4096 res4c_branch2c[0][0] 
__________________________________________________________________________________________________ res4c (Add) (None, None, None, 1 0 bn4c_branch2c[0][0] res4b_relu[0][0] __________________________________________________________________________________________________ res4c_relu (Activation) (None, None, None, 1 0 res4c[0][0] __________________________________________________________________________________________________ res4d_branch2a (Conv2D) (None, None, None, 2 262144 res4c_relu[0][0] __________________________________________________________________________________________________ bn4d_branch2a (BatchNormalizati (None, None, None, 2 1024 res4d_branch2a[0][0] __________________________________________________________________________________________________ res4d_branch2a_relu (Activation (None, None, None, 2 0 bn4d_branch2a[0][0] __________________________________________________________________________________________________ padding4d_branch2b (ZeroPadding (None, None, None, 2 0 res4d_branch2a_relu[0][0] __________________________________________________________________________________________________ res4d_branch2b (Conv2D) (None, None, None, 2 589824 padding4d_branch2b[0][0] __________________________________________________________________________________________________ bn4d_branch2b (BatchNormalizati (None, None, None, 2 1024 res4d_branch2b[0][0] __________________________________________________________________________________________________ res4d_branch2b_relu (Activation (None, None, None, 2 0 bn4d_branch2b[0][0] __________________________________________________________________________________________________ res4d_branch2c (Conv2D) (None, None, None, 1 262144 res4d_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4d_branch2c (BatchNormalizati (None, None, None, 1 4096 res4d_branch2c[0][0] 
__________________________________________________________________________________________________ res4d (Add) (None, None, None, 1 0 bn4d_branch2c[0][0] res4c_relu[0][0] __________________________________________________________________________________________________ res4d_relu (Activation) (None, None, None, 1 0 res4d[0][0] __________________________________________________________________________________________________ res4e_branch2a (Conv2D) (None, None, None, 2 262144 res4d_relu[0][0] __________________________________________________________________________________________________ bn4e_branch2a (BatchNormalizati (None, None, None, 2 1024 res4e_branch2a[0][0] __________________________________________________________________________________________________ res4e_branch2a_relu (Activation (None, None, None, 2 0 bn4e_branch2a[0][0] __________________________________________________________________________________________________ padding4e_branch2b (ZeroPadding (None, None, None, 2 0 res4e_branch2a_relu[0][0] __________________________________________________________________________________________________ res4e_branch2b (Conv2D) (None, None, None, 2 589824 padding4e_branch2b[0][0] __________________________________________________________________________________________________ bn4e_branch2b (BatchNormalizati (None, None, None, 2 1024 res4e_branch2b[0][0] __________________________________________________________________________________________________ res4e_branch2b_relu (Activation (None, None, None, 2 0 bn4e_branch2b[0][0] __________________________________________________________________________________________________ res4e_branch2c (Conv2D) (None, None, None, 1 262144 res4e_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4e_branch2c (BatchNormalizati (None, None, None, 1 4096 res4e_branch2c[0][0] 
__________________________________________________________________________________________________ res4e (Add) (None, None, None, 1 0 bn4e_branch2c[0][0] res4d_relu[0][0] __________________________________________________________________________________________________ res4e_relu (Activation) (None, None, None, 1 0 res4e[0][0] __________________________________________________________________________________________________ res4f_branch2a (Conv2D) (None, None, None, 2 262144 res4e_relu[0][0] __________________________________________________________________________________________________ bn4f_branch2a (BatchNormalizati (None, None, None, 2 1024 res4f_branch2a[0][0] __________________________________________________________________________________________________ res4f_branch2a_relu (Activation (None, None, None, 2 0 bn4f_branch2a[0][0] __________________________________________________________________________________________________ padding4f_branch2b (ZeroPadding (None, None, None, 2 0 res4f_branch2a_relu[0][0] __________________________________________________________________________________________________ res4f_branch2b (Conv2D) (None, None, None, 2 589824 padding4f_branch2b[0][0] __________________________________________________________________________________________________ bn4f_branch2b (BatchNormalizati (None, None, None, 2 1024 res4f_branch2b[0][0] __________________________________________________________________________________________________ res4f_branch2b_relu (Activation (None, None, None, 2 0 bn4f_branch2b[0][0] __________________________________________________________________________________________________ res4f_branch2c (Conv2D) (None, None, None, 1 262144 res4f_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4f_branch2c (BatchNormalizati (None, None, None, 1 4096 res4f_branch2c[0][0] 
__________________________________________________________________________________________________ res4f (Add) (None, None, None, 1 0 bn4f_branch2c[0][0] res4e_relu[0][0] __________________________________________________________________________________________________ res4f_relu (Activation) (None, None, None, 1 0 res4f[0][0] __________________________________________________________________________________________________ res5a_branch2a (Conv2D) (None, None, None, 5 524288 res4f_relu[0][0] __________________________________________________________________________________________________ bn5a_branch2a (BatchNormalizati (None, None, None, 5 2048 res5a_branch2a[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu (Activation (None, None, None, 5 0 bn5a_branch2a[0][0] __________________________________________________________________________________________________ padding5a_branch2b (ZeroPadding (None, None, None, 5 0 res5a_branch2a_relu[0][0] __________________________________________________________________________________________________ res5a_branch2b (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b[0][0] __________________________________________________________________________________________________ bn5a_branch2b (BatchNormalizati (None, None, None, 5 2048 res5a_branch2b[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu (Activation (None, None, None, 5 0 bn5a_branch2b[0][0] __________________________________________________________________________________________________ res5a_branch2c (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu[0][0] __________________________________________________________________________________________________ res5a_branch1 (Conv2D) (None, None, None, 2 2097152 res4f_relu[0][0] 
__________________________________________________________________________________________________
bn5a_branch2c (BatchNormalizati (None, None, None, 2 8192 res5a_branch2c[0][0]
__________________________________________________________________________________________________
bn5a_branch1 (BatchNormalizatio (None, None, None, 2 8192 res5a_branch1[0][0]
__________________________________________________________________________________________________
res5a (Add) (None, None, None, 2 0 bn5a_branch2c[0][0] bn5a_branch1[0][0]
__________________________________________________________________________________________________
res5a_relu (Activation) (None, None, None, 2 0 res5a[0][0]
__________________________________________________________________________________________________
res5b_branch2a (Conv2D) (None, None, None, 5 1048576 res5a_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2a (BatchNormalizati (None, None, None, 5 2048 res5b_branch2a[0][0]
__________________________________________________________________________________________________
res5b_branch2a_relu (Activation (None, None, None, 5 0 bn5b_branch2a[0][0]
__________________________________________________________________________________________________
padding5b_branch2b (ZeroPadding (None, None, None, 5 0 res5b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5b_branch2b (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b[0][0]
__________________________________________________________________________________________________
bn5b_branch2b (BatchNormalizati (None, None, None, 5 2048 res5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2b_relu (Activation (None, None, None, 5 0 bn5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2c (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2c (BatchNormalizati (None, None, None, 2 8192 res5b_branch2c[0][0]
__________________________________________________________________________________________________
res5b (Add) (None, None, None, 2 0 bn5b_branch2c[0][0] res5a_relu[0][0]
__________________________________________________________________________________________________
res5b_relu (Activation) (None, None, None, 2 0 res5b[0][0]
__________________________________________________________________________________________________
res5c_branch2a (Conv2D) (None, None, None, 5 1048576 res5b_relu[0][0]
__________________________________________________________________________________________________
bn5c_branch2a (BatchNormalizati (None, None, None, 5 2048 res5c_branch2a[0][0]
__________________________________________________________________________________________________
res5c_branch2a_relu (Activation (None, None, None, 5 0 bn5c_branch2a[0][0]
__________________________________________________________________________________________________
padding5c_branch2b (ZeroPadding (None, None, None, 5 0 res5c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5c_branch2b (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b[0][0]
__________________________________________________________________________________________________
bn5c_branch2b (BatchNormalizati (None, None, None, 5 2048 res5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2b_relu (Activation (None, None, None, 5 0 bn5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2c (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn5c_branch2c (BatchNormalizati (None, None, None, 2 8192 res5c_branch2c[0][0]
__________________________________________________________________________________________________
res5c (Add) (None, None, None, 2 0 bn5c_branch2c[0][0] res5b_relu[0][0]
__________________________________________________________________________________________________
res5c_relu (Activation) (None, None, None, 2 0 res5c[0][0]
__________________________________________________________________________________________________
C5_reduced (Conv2D) (None, None, None, 2 524544 res5c_relu[0][0]
__________________________________________________________________________________________________
P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] res4f_relu[0][0]
__________________________________________________________________________________________________
C4_reduced (Conv2D) (None, None, None, 2 262400 res4f_relu[0][0]
__________________________________________________________________________________________________
P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0]
__________________________________________________________________________________________________
P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] res3d_relu[0][0]
__________________________________________________________________________________________________
C3_reduced (Conv2D) (None, None, None, 2 131328 res3d_relu[0][0]
__________________________________________________________________________________________________
P6 (Conv2D) (None, None, None, 2 4718848 res5c_relu[0][0]
__________________________________________________________________________________________________
P3_merged (Add) (None, None, None, 2 0 P4_upsampled[0][0] C3_reduced[0][0]
__________________________________________________________________________________________________
C6_relu (Activation) (None, None, None, 2 0 P6[0][0]
__________________________________________________________________________________________________
P3 (Conv2D) (None, None, None, 2 590080 P3_merged[0][0]
__________________________________________________________________________________________________
P4 (Conv2D) (None, None, None, 2 590080 P4_merged[0][0]
__________________________________________________________________________________________________
P5 (Conv2D) (None, None, None, 2 590080 C5_reduced[0][0]
__________________________________________________________________________________________________
P7 (Conv2D) (None, None, None, 2 590080 C6_relu[0][0]
__________________________________________________________________________________________________
regression_submodel (Model) (None, None, 4) 2443300 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0]
__________________________________________________________________________________________________
classification_submodel (Model) (None, None, 1) 2381065 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0]
__________________________________________________________________________________________________
regression (Concatenate) (None, None, 4) 0 regression_submodel[1][0] regression_submodel[2][0] regression_submodel[3][0] regression_submodel[4][0] regression_submodel[5][0]
__________________________________________________________________________________________________
classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0]
==================================================================================================
Total params: 36,382,957
Trainable params: 36,276,717
Non-trainable params: 106,240
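The Param # column above can be spot-checked by hand. A minimal sketch in Python, assuming the standard ResNet-50/RetinaNet channel widths (C3/C4/C5 = 512/1024/2048, FPN = 256), which the truncated Output Shape column hides; the backbone convolutions in keras-retinanet carry no bias, the FPN convolutions do:

```python
# Recompute a few "Param #" entries from the summary above.
# Assumed widths: ResNet-50 C5 = 2048 channels, FPN = 256 channels.

def conv2d_params(k, c_in, c_out, bias=True):
    """k x k convolution: k*k*c_in*c_out weights plus an optional per-filter bias."""
    return k * k * c_in * c_out + (c_out if bias else 0)

def bn_params(channels):
    """BatchNormalization: gamma, beta, moving mean, moving variance per channel."""
    return 4 * channels

print(conv2d_params(1, 2048, 512, bias=False))  # res5b_branch2a -> 1048576
print(conv2d_params(3, 512, 512, bias=False))   # res5b_branch2b -> 2359296
print(bn_params(2048))                          # bn5b_branch2c  -> 8192
print(conv2d_params(1, 2048, 256))              # C5_reduced     -> 524544
print(conv2d_params(3, 256, 256))               # P3/P4/P5/P7    -> 590080
print(conv2d_params(3, 2048, 256))              # P6 (3x3 on C5) -> 4718848
```

Every line reproduces the corresponding count in the summary, which confirms the assumed channel widths; this is a sanity check, not part of the training script.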
__________________________________________________________________________________________________
None
Epoch 1/150
  1/500 [..............................] - ETA: 36:57 - loss: 4.0212 - regression_loss: 2.8930 - classification_loss: 1.1281
[... steps 2-270: per-step progress-bar updates, normally overwritten in place in the terminal; loss falls steadily from 4.04 to 3.40 (regression_loss 2.91 -> 2.60, classification_loss 1.13 -> 0.80) ...]
271/500 [===============>..............] - ETA: 58s - loss: 3.3982 - regression_loss: 2.6022 - classification_loss: 0.7960
272/500 [===============>..............]
- ETA: 58s - loss: 3.3974 - regression_loss: 2.6024 - classification_loss: 0.7950 273/500 [===============>..............] - ETA: 58s - loss: 3.3953 - regression_loss: 2.6016 - classification_loss: 0.7937 274/500 [===============>..............] - ETA: 58s - loss: 3.3940 - regression_loss: 2.6013 - classification_loss: 0.7927 275/500 [===============>..............] - ETA: 57s - loss: 3.3924 - regression_loss: 2.6008 - classification_loss: 0.7915 276/500 [===============>..............] - ETA: 57s - loss: 3.3901 - regression_loss: 2.5998 - classification_loss: 0.7903 277/500 [===============>..............] - ETA: 57s - loss: 3.3889 - regression_loss: 2.5990 - classification_loss: 0.7898 278/500 [===============>..............] - ETA: 57s - loss: 3.3874 - regression_loss: 2.5987 - classification_loss: 0.7887 279/500 [===============>..............] - ETA: 56s - loss: 3.3864 - regression_loss: 2.5984 - classification_loss: 0.7880 280/500 [===============>..............] - ETA: 56s - loss: 3.3863 - regression_loss: 2.5992 - classification_loss: 0.7871 281/500 [===============>..............] - ETA: 56s - loss: 3.3844 - regression_loss: 2.5981 - classification_loss: 0.7863 282/500 [===============>..............] - ETA: 55s - loss: 3.3822 - regression_loss: 2.5973 - classification_loss: 0.7849 283/500 [===============>..............] - ETA: 55s - loss: 3.3803 - regression_loss: 2.5967 - classification_loss: 0.7836 284/500 [================>.............] - ETA: 55s - loss: 3.3798 - regression_loss: 2.5972 - classification_loss: 0.7826 285/500 [================>.............] - ETA: 55s - loss: 3.3777 - regression_loss: 2.5963 - classification_loss: 0.7814 286/500 [================>.............] - ETA: 54s - loss: 3.3763 - regression_loss: 2.5961 - classification_loss: 0.7802 287/500 [================>.............] - ETA: 54s - loss: 3.3749 - regression_loss: 2.5957 - classification_loss: 0.7792 288/500 [================>.............] 
- ETA: 54s - loss: 3.3747 - regression_loss: 2.5961 - classification_loss: 0.7785 289/500 [================>.............] - ETA: 54s - loss: 3.3729 - regression_loss: 2.5956 - classification_loss: 0.7773 290/500 [================>.............] - ETA: 53s - loss: 3.3729 - regression_loss: 2.5946 - classification_loss: 0.7782 291/500 [================>.............] - ETA: 53s - loss: 3.3711 - regression_loss: 2.5939 - classification_loss: 0.7773 292/500 [================>.............] - ETA: 53s - loss: 3.3688 - regression_loss: 2.5930 - classification_loss: 0.7757 293/500 [================>.............] - ETA: 52s - loss: 3.3668 - regression_loss: 2.5923 - classification_loss: 0.7745 294/500 [================>.............] - ETA: 52s - loss: 3.3641 - regression_loss: 2.5911 - classification_loss: 0.7730 295/500 [================>.............] - ETA: 52s - loss: 3.3625 - regression_loss: 2.5902 - classification_loss: 0.7724 296/500 [================>.............] - ETA: 52s - loss: 3.3605 - regression_loss: 2.5892 - classification_loss: 0.7713 297/500 [================>.............] - ETA: 51s - loss: 3.3590 - regression_loss: 2.5885 - classification_loss: 0.7704 298/500 [================>.............] - ETA: 51s - loss: 3.3585 - regression_loss: 2.5886 - classification_loss: 0.7699 299/500 [================>.............] - ETA: 51s - loss: 3.3568 - regression_loss: 2.5881 - classification_loss: 0.7687 300/500 [=================>............] - ETA: 51s - loss: 3.3558 - regression_loss: 2.5880 - classification_loss: 0.7678 301/500 [=================>............] - ETA: 50s - loss: 3.3538 - regression_loss: 2.5873 - classification_loss: 0.7665 302/500 [=================>............] - ETA: 50s - loss: 3.3520 - regression_loss: 2.5866 - classification_loss: 0.7654 303/500 [=================>............] - ETA: 50s - loss: 3.3514 - regression_loss: 2.5867 - classification_loss: 0.7647 304/500 [=================>............] 
- ETA: 49s - loss: 3.3510 - regression_loss: 2.5866 - classification_loss: 0.7644 305/500 [=================>............] - ETA: 49s - loss: 3.3492 - regression_loss: 2.5859 - classification_loss: 0.7633 306/500 [=================>............] - ETA: 49s - loss: 3.3475 - regression_loss: 2.5853 - classification_loss: 0.7622 307/500 [=================>............] - ETA: 49s - loss: 3.3457 - regression_loss: 2.5846 - classification_loss: 0.7610 308/500 [=================>............] - ETA: 48s - loss: 3.3443 - regression_loss: 2.5840 - classification_loss: 0.7603 309/500 [=================>............] - ETA: 48s - loss: 3.3435 - regression_loss: 2.5837 - classification_loss: 0.7598 310/500 [=================>............] - ETA: 48s - loss: 3.3426 - regression_loss: 2.5836 - classification_loss: 0.7590 311/500 [=================>............] - ETA: 48s - loss: 3.3419 - regression_loss: 2.5836 - classification_loss: 0.7583 312/500 [=================>............] - ETA: 47s - loss: 3.3432 - regression_loss: 2.5854 - classification_loss: 0.7578 313/500 [=================>............] - ETA: 47s - loss: 3.3419 - regression_loss: 2.5850 - classification_loss: 0.7569 314/500 [=================>............] - ETA: 47s - loss: 3.3399 - regression_loss: 2.5839 - classification_loss: 0.7560 315/500 [=================>............] - ETA: 47s - loss: 3.3384 - regression_loss: 2.5833 - classification_loss: 0.7552 316/500 [=================>............] - ETA: 46s - loss: 3.3390 - regression_loss: 2.5838 - classification_loss: 0.7552 317/500 [==================>...........] - ETA: 46s - loss: 3.3382 - regression_loss: 2.5838 - classification_loss: 0.7544 318/500 [==================>...........] - ETA: 46s - loss: 3.3365 - regression_loss: 2.5830 - classification_loss: 0.7534 319/500 [==================>...........] - ETA: 46s - loss: 3.3346 - regression_loss: 2.5823 - classification_loss: 0.7523 320/500 [==================>...........] 
- ETA: 45s - loss: 3.3330 - regression_loss: 2.5817 - classification_loss: 0.7513 321/500 [==================>...........] - ETA: 45s - loss: 3.3317 - regression_loss: 2.5814 - classification_loss: 0.7503 322/500 [==================>...........] - ETA: 45s - loss: 3.3308 - regression_loss: 2.5810 - classification_loss: 0.7498 323/500 [==================>...........] - ETA: 45s - loss: 3.3294 - regression_loss: 2.5805 - classification_loss: 0.7489 324/500 [==================>...........] - ETA: 44s - loss: 3.3281 - regression_loss: 2.5801 - classification_loss: 0.7480 325/500 [==================>...........] - ETA: 44s - loss: 3.3265 - regression_loss: 2.5795 - classification_loss: 0.7471 326/500 [==================>...........] - ETA: 44s - loss: 3.3252 - regression_loss: 2.5792 - classification_loss: 0.7461 327/500 [==================>...........] - ETA: 43s - loss: 3.3236 - regression_loss: 2.5785 - classification_loss: 0.7452 328/500 [==================>...........] - ETA: 43s - loss: 3.3219 - regression_loss: 2.5779 - classification_loss: 0.7441 329/500 [==================>...........] - ETA: 43s - loss: 3.3203 - regression_loss: 2.5772 - classification_loss: 0.7431 330/500 [==================>...........] - ETA: 43s - loss: 3.3203 - regression_loss: 2.5774 - classification_loss: 0.7428 331/500 [==================>...........] - ETA: 42s - loss: 3.3187 - regression_loss: 2.5770 - classification_loss: 0.7417 332/500 [==================>...........] - ETA: 42s - loss: 3.3172 - regression_loss: 2.5760 - classification_loss: 0.7412 333/500 [==================>...........] - ETA: 42s - loss: 3.3156 - regression_loss: 2.5753 - classification_loss: 0.7403 334/500 [===================>..........] - ETA: 42s - loss: 3.3139 - regression_loss: 2.5746 - classification_loss: 0.7393 335/500 [===================>..........] - ETA: 41s - loss: 3.3138 - regression_loss: 2.5750 - classification_loss: 0.7387 336/500 [===================>..........] 
- ETA: 41s - loss: 3.3122 - regression_loss: 2.5745 - classification_loss: 0.7378 337/500 [===================>..........] - ETA: 41s - loss: 3.3107 - regression_loss: 2.5739 - classification_loss: 0.7368 338/500 [===================>..........] - ETA: 41s - loss: 3.3102 - regression_loss: 2.5737 - classification_loss: 0.7365 339/500 [===================>..........] - ETA: 40s - loss: 3.3099 - regression_loss: 2.5738 - classification_loss: 0.7361 340/500 [===================>..........] - ETA: 40s - loss: 3.3116 - regression_loss: 2.5760 - classification_loss: 0.7356 341/500 [===================>..........] - ETA: 40s - loss: 3.3101 - regression_loss: 2.5752 - classification_loss: 0.7349 342/500 [===================>..........] - ETA: 40s - loss: 3.3074 - regression_loss: 2.5736 - classification_loss: 0.7338 343/500 [===================>..........] - ETA: 39s - loss: 3.3058 - regression_loss: 2.5729 - classification_loss: 0.7329 344/500 [===================>..........] - ETA: 39s - loss: 3.3051 - regression_loss: 2.5730 - classification_loss: 0.7321 345/500 [===================>..........] - ETA: 39s - loss: 3.3035 - regression_loss: 2.5724 - classification_loss: 0.7312 346/500 [===================>..........] - ETA: 39s - loss: 3.3021 - regression_loss: 2.5716 - classification_loss: 0.7305 347/500 [===================>..........] - ETA: 38s - loss: 3.3021 - regression_loss: 2.5723 - classification_loss: 0.7298 348/500 [===================>..........] - ETA: 38s - loss: 3.3002 - regression_loss: 2.5714 - classification_loss: 0.7288 349/500 [===================>..........] - ETA: 38s - loss: 3.2991 - regression_loss: 2.5710 - classification_loss: 0.7281 350/500 [====================>.........] - ETA: 38s - loss: 3.2983 - regression_loss: 2.5709 - classification_loss: 0.7274 351/500 [====================>.........] - ETA: 37s - loss: 3.2969 - regression_loss: 2.5703 - classification_loss: 0.7265 352/500 [====================>.........] 
- ETA: 37s - loss: 3.2962 - regression_loss: 2.5701 - classification_loss: 0.7261 353/500 [====================>.........] - ETA: 37s - loss: 3.2949 - regression_loss: 2.5698 - classification_loss: 0.7251 354/500 [====================>.........] - ETA: 36s - loss: 3.2935 - regression_loss: 2.5695 - classification_loss: 0.7240 355/500 [====================>.........] - ETA: 36s - loss: 3.2927 - regression_loss: 2.5692 - classification_loss: 0.7235 356/500 [====================>.........] - ETA: 36s - loss: 3.2921 - regression_loss: 2.5694 - classification_loss: 0.7227 357/500 [====================>.........] - ETA: 36s - loss: 3.2903 - regression_loss: 2.5684 - classification_loss: 0.7219 358/500 [====================>.........] - ETA: 35s - loss: 3.2889 - regression_loss: 2.5678 - classification_loss: 0.7211 359/500 [====================>.........] - ETA: 35s - loss: 3.2881 - regression_loss: 2.5676 - classification_loss: 0.7205 360/500 [====================>.........] - ETA: 35s - loss: 3.2869 - regression_loss: 2.5671 - classification_loss: 0.7199 361/500 [====================>.........] - ETA: 35s - loss: 3.2853 - regression_loss: 2.5664 - classification_loss: 0.7190 362/500 [====================>.........] - ETA: 34s - loss: 3.2834 - regression_loss: 2.5655 - classification_loss: 0.7179 363/500 [====================>.........] - ETA: 34s - loss: 3.2813 - regression_loss: 2.5643 - classification_loss: 0.7171 364/500 [====================>.........] - ETA: 34s - loss: 3.2791 - regression_loss: 2.5631 - classification_loss: 0.7161 365/500 [====================>.........] - ETA: 34s - loss: 3.2787 - regression_loss: 2.5630 - classification_loss: 0.7157 366/500 [====================>.........] - ETA: 33s - loss: 3.2796 - regression_loss: 2.5643 - classification_loss: 0.7152 367/500 [=====================>........] - ETA: 33s - loss: 3.2783 - regression_loss: 2.5639 - classification_loss: 0.7145 368/500 [=====================>........] 
- ETA: 33s - loss: 3.2783 - regression_loss: 2.5643 - classification_loss: 0.7139 369/500 [=====================>........] - ETA: 33s - loss: 3.2770 - regression_loss: 2.5638 - classification_loss: 0.7131 370/500 [=====================>........] - ETA: 32s - loss: 3.2756 - regression_loss: 2.5632 - classification_loss: 0.7124 371/500 [=====================>........] - ETA: 32s - loss: 3.2742 - regression_loss: 2.5628 - classification_loss: 0.7114 372/500 [=====================>........] - ETA: 32s - loss: 3.2725 - regression_loss: 2.5620 - classification_loss: 0.7105 373/500 [=====================>........] - ETA: 32s - loss: 3.2710 - regression_loss: 2.5613 - classification_loss: 0.7097 374/500 [=====================>........] - ETA: 31s - loss: 3.2690 - regression_loss: 2.5604 - classification_loss: 0.7087 375/500 [=====================>........] - ETA: 31s - loss: 3.2679 - regression_loss: 2.5598 - classification_loss: 0.7081 376/500 [=====================>........] - ETA: 31s - loss: 3.2665 - regression_loss: 2.5591 - classification_loss: 0.7074 377/500 [=====================>........] - ETA: 31s - loss: 3.2661 - regression_loss: 2.5592 - classification_loss: 0.7069 378/500 [=====================>........] - ETA: 30s - loss: 3.2656 - regression_loss: 2.5592 - classification_loss: 0.7064 379/500 [=====================>........] - ETA: 30s - loss: 3.2651 - regression_loss: 2.5591 - classification_loss: 0.7060 380/500 [=====================>........] - ETA: 30s - loss: 3.2646 - regression_loss: 2.5590 - classification_loss: 0.7056 381/500 [=====================>........] - ETA: 30s - loss: 3.2652 - regression_loss: 2.5601 - classification_loss: 0.7051 382/500 [=====================>........] - ETA: 29s - loss: 3.2637 - regression_loss: 2.5594 - classification_loss: 0.7043 383/500 [=====================>........] - ETA: 29s - loss: 3.2622 - regression_loss: 2.5590 - classification_loss: 0.7032 384/500 [======================>.......] 
- ETA: 29s - loss: 3.2609 - regression_loss: 2.5583 - classification_loss: 0.7026 385/500 [======================>.......] - ETA: 29s - loss: 3.2614 - regression_loss: 2.5587 - classification_loss: 0.7027 386/500 [======================>.......] - ETA: 28s - loss: 3.2601 - regression_loss: 2.5581 - classification_loss: 0.7020 387/500 [======================>.......] - ETA: 28s - loss: 3.2588 - regression_loss: 2.5576 - classification_loss: 0.7012 388/500 [======================>.......] - ETA: 28s - loss: 3.2567 - regression_loss: 2.5566 - classification_loss: 0.7001 389/500 [======================>.......] - ETA: 27s - loss: 3.2553 - regression_loss: 2.5559 - classification_loss: 0.6994 390/500 [======================>.......] - ETA: 27s - loss: 3.2544 - regression_loss: 2.5558 - classification_loss: 0.6986 391/500 [======================>.......] - ETA: 27s - loss: 3.2532 - regression_loss: 2.5551 - classification_loss: 0.6980 392/500 [======================>.......] - ETA: 27s - loss: 3.2515 - regression_loss: 2.5543 - classification_loss: 0.6972 393/500 [======================>.......] - ETA: 26s - loss: 3.2504 - regression_loss: 2.5537 - classification_loss: 0.6967 394/500 [======================>.......] - ETA: 26s - loss: 3.2493 - regression_loss: 2.5533 - classification_loss: 0.6960 395/500 [======================>.......] - ETA: 26s - loss: 3.2481 - regression_loss: 2.5527 - classification_loss: 0.6954 396/500 [======================>.......] - ETA: 26s - loss: 3.2470 - regression_loss: 2.5519 - classification_loss: 0.6951 397/500 [======================>.......] - ETA: 25s - loss: 3.2460 - regression_loss: 2.5514 - classification_loss: 0.6946 398/500 [======================>.......] - ETA: 25s - loss: 3.2460 - regression_loss: 2.5516 - classification_loss: 0.6944 399/500 [======================>.......] - ETA: 25s - loss: 3.2458 - regression_loss: 2.5516 - classification_loss: 0.6942 400/500 [=======================>......] 
- ETA: 25s - loss: 3.2451 - regression_loss: 2.5510 - classification_loss: 0.6941 401/500 [=======================>......] - ETA: 24s - loss: 3.2447 - regression_loss: 2.5508 - classification_loss: 0.6939 402/500 [=======================>......] - ETA: 24s - loss: 3.2436 - regression_loss: 2.5506 - classification_loss: 0.6930 403/500 [=======================>......] - ETA: 24s - loss: 3.2423 - regression_loss: 2.5500 - classification_loss: 0.6923 404/500 [=======================>......] - ETA: 24s - loss: 3.2407 - regression_loss: 2.5490 - classification_loss: 0.6917 405/500 [=======================>......] - ETA: 23s - loss: 3.2409 - regression_loss: 2.5494 - classification_loss: 0.6915 406/500 [=======================>......] - ETA: 23s - loss: 3.2399 - regression_loss: 2.5490 - classification_loss: 0.6909 407/500 [=======================>......] - ETA: 23s - loss: 3.2393 - regression_loss: 2.5489 - classification_loss: 0.6904 408/500 [=======================>......] - ETA: 23s - loss: 3.2383 - regression_loss: 2.5485 - classification_loss: 0.6898 409/500 [=======================>......] - ETA: 22s - loss: 3.2374 - regression_loss: 2.5483 - classification_loss: 0.6891 410/500 [=======================>......] - ETA: 22s - loss: 3.2366 - regression_loss: 2.5478 - classification_loss: 0.6887 411/500 [=======================>......] - ETA: 22s - loss: 3.2360 - regression_loss: 2.5478 - classification_loss: 0.6882 412/500 [=======================>......] - ETA: 22s - loss: 3.2356 - regression_loss: 2.5477 - classification_loss: 0.6878 413/500 [=======================>......] - ETA: 21s - loss: 3.2342 - regression_loss: 2.5470 - classification_loss: 0.6872 414/500 [=======================>......] - ETA: 21s - loss: 3.2338 - regression_loss: 2.5470 - classification_loss: 0.6869 415/500 [=======================>......] - ETA: 21s - loss: 3.2320 - regression_loss: 2.5461 - classification_loss: 0.6859 416/500 [=======================>......] 
- ETA: 21s - loss: 3.2313 - regression_loss: 2.5460 - classification_loss: 0.6853 417/500 [========================>.....] - ETA: 20s - loss: 3.2294 - regression_loss: 2.5445 - classification_loss: 0.6849 418/500 [========================>.....] - ETA: 20s - loss: 3.2282 - regression_loss: 2.5438 - classification_loss: 0.6843 419/500 [========================>.....] - ETA: 20s - loss: 3.2269 - regression_loss: 2.5432 - classification_loss: 0.6837 420/500 [========================>.....] - ETA: 20s - loss: 3.2259 - regression_loss: 2.5428 - classification_loss: 0.6830 421/500 [========================>.....] - ETA: 19s - loss: 3.2258 - regression_loss: 2.5431 - classification_loss: 0.6827 422/500 [========================>.....] - ETA: 19s - loss: 3.2246 - regression_loss: 2.5424 - classification_loss: 0.6821 423/500 [========================>.....] - ETA: 19s - loss: 3.2232 - regression_loss: 2.5416 - classification_loss: 0.6816 424/500 [========================>.....] - ETA: 18s - loss: 3.2217 - regression_loss: 2.5409 - classification_loss: 0.6808 425/500 [========================>.....] - ETA: 18s - loss: 3.2213 - regression_loss: 2.5407 - classification_loss: 0.6806 426/500 [========================>.....] - ETA: 18s - loss: 3.2202 - regression_loss: 2.5402 - classification_loss: 0.6800 427/500 [========================>.....] - ETA: 18s - loss: 3.2197 - regression_loss: 2.5401 - classification_loss: 0.6796 428/500 [========================>.....] - ETA: 17s - loss: 3.2185 - regression_loss: 2.5395 - classification_loss: 0.6789 429/500 [========================>.....] - ETA: 17s - loss: 3.2183 - regression_loss: 2.5398 - classification_loss: 0.6785 430/500 [========================>.....] - ETA: 17s - loss: 3.2187 - regression_loss: 2.5406 - classification_loss: 0.6781 431/500 [========================>.....] - ETA: 17s - loss: 3.2185 - regression_loss: 2.5408 - classification_loss: 0.6777 432/500 [========================>.....] 
- ETA: 16s - loss: 3.2170 - regression_loss: 2.5399 - classification_loss: 0.6771 433/500 [========================>.....] - ETA: 16s - loss: 3.2148 - regression_loss: 2.5385 - classification_loss: 0.6763 434/500 [=========================>....] - ETA: 16s - loss: 3.2125 - regression_loss: 2.5370 - classification_loss: 0.6755 435/500 [=========================>....] - ETA: 16s - loss: 3.2121 - regression_loss: 2.5370 - classification_loss: 0.6751 436/500 [=========================>....] - ETA: 15s - loss: 3.2118 - regression_loss: 2.5371 - classification_loss: 0.6747 437/500 [=========================>....] - ETA: 15s - loss: 3.2109 - regression_loss: 2.5366 - classification_loss: 0.6742 438/500 [=========================>....] - ETA: 15s - loss: 3.2098 - regression_loss: 2.5362 - classification_loss: 0.6735 439/500 [=========================>....] - ETA: 15s - loss: 3.2086 - regression_loss: 2.5355 - classification_loss: 0.6730 440/500 [=========================>....] - ETA: 14s - loss: 3.2070 - regression_loss: 2.5346 - classification_loss: 0.6724 441/500 [=========================>....] - ETA: 14s - loss: 3.2058 - regression_loss: 2.5339 - classification_loss: 0.6719 442/500 [=========================>....] - ETA: 14s - loss: 3.2057 - regression_loss: 2.5339 - classification_loss: 0.6718 443/500 [=========================>....] - ETA: 14s - loss: 3.2043 - regression_loss: 2.5331 - classification_loss: 0.6712 444/500 [=========================>....] - ETA: 13s - loss: 3.2033 - regression_loss: 2.5326 - classification_loss: 0.6707 445/500 [=========================>....] - ETA: 13s - loss: 3.2030 - regression_loss: 2.5326 - classification_loss: 0.6704 446/500 [=========================>....] - ETA: 13s - loss: 3.2019 - regression_loss: 2.5323 - classification_loss: 0.6697 447/500 [=========================>....] - ETA: 13s - loss: 3.2007 - regression_loss: 2.5317 - classification_loss: 0.6690 448/500 [=========================>....] 
- ETA: 12s - loss: 3.2010 - regression_loss: 2.5309 - classification_loss: 0.6701 449/500 [=========================>....] - ETA: 12s - loss: 3.1995 - regression_loss: 2.5301 - classification_loss: 0.6694 450/500 [==========================>...] - ETA: 12s - loss: 3.2001 - regression_loss: 2.5310 - classification_loss: 0.6691 451/500 [==========================>...] - ETA: 12s - loss: 3.2012 - regression_loss: 2.5323 - classification_loss: 0.6688 452/500 [==========================>...] - ETA: 11s - loss: 3.2000 - regression_loss: 2.5317 - classification_loss: 0.6682 453/500 [==========================>...] - ETA: 11s - loss: 3.1994 - regression_loss: 2.5314 - classification_loss: 0.6681 454/500 [==========================>...] - ETA: 11s - loss: 3.1985 - regression_loss: 2.5309 - classification_loss: 0.6676 455/500 [==========================>...] - ETA: 11s - loss: 3.1973 - regression_loss: 2.5303 - classification_loss: 0.6669 456/500 [==========================>...] - ETA: 10s - loss: 3.1964 - regression_loss: 2.5300 - classification_loss: 0.6664 457/500 [==========================>...] - ETA: 10s - loss: 3.1954 - regression_loss: 2.5296 - classification_loss: 0.6658 458/500 [==========================>...] - ETA: 10s - loss: 3.1945 - regression_loss: 2.5291 - classification_loss: 0.6653 459/500 [==========================>...] - ETA: 10s - loss: 3.1940 - regression_loss: 2.5291 - classification_loss: 0.6649 460/500 [==========================>...] - ETA: 9s - loss: 3.1927 - regression_loss: 2.5284 - classification_loss: 0.6643 461/500 [==========================>...] - ETA: 9s - loss: 3.1915 - regression_loss: 2.5280 - classification_loss: 0.6636 462/500 [==========================>...] - ETA: 9s - loss: 3.1904 - regression_loss: 2.5274 - classification_loss: 0.6630 463/500 [==========================>...] - ETA: 9s - loss: 3.1885 - regression_loss: 2.5263 - classification_loss: 0.6622 464/500 [==========================>...] 
- ETA: 8s - loss: 3.1879 - regression_loss: 2.5261 - classification_loss: 0.6618 465/500 [==========================>...] - ETA: 8s - loss: 3.1867 - regression_loss: 2.5254 - classification_loss: 0.6613 466/500 [==========================>...] - ETA: 8s - loss: 3.1858 - regression_loss: 2.5250 - classification_loss: 0.6607 467/500 [===========================>..] - ETA: 8s - loss: 3.1850 - regression_loss: 2.5247 - classification_loss: 0.6603 468/500 [===========================>..] - ETA: 7s - loss: 3.1841 - regression_loss: 2.5243 - classification_loss: 0.6598 469/500 [===========================>..] - ETA: 7s - loss: 3.1867 - regression_loss: 2.5238 - classification_loss: 0.6629 470/500 [===========================>..] - ETA: 7s - loss: 3.1865 - regression_loss: 2.5233 - classification_loss: 0.6633 471/500 [===========================>..] - ETA: 7s - loss: 3.1857 - regression_loss: 2.5231 - classification_loss: 0.6626 472/500 [===========================>..] - ETA: 6s - loss: 3.1854 - regression_loss: 2.5229 - classification_loss: 0.6625 473/500 [===========================>..] - ETA: 6s - loss: 3.1836 - regression_loss: 2.5218 - classification_loss: 0.6619 474/500 [===========================>..] - ETA: 6s - loss: 3.1848 - regression_loss: 2.5230 - classification_loss: 0.6618 475/500 [===========================>..] - ETA: 6s - loss: 3.1831 - regression_loss: 2.5219 - classification_loss: 0.6612 476/500 [===========================>..] - ETA: 5s - loss: 3.1824 - regression_loss: 2.5215 - classification_loss: 0.6609 477/500 [===========================>..] - ETA: 5s - loss: 3.1819 - regression_loss: 2.5214 - classification_loss: 0.6606 478/500 [===========================>..] - ETA: 5s - loss: 3.1806 - regression_loss: 2.5206 - classification_loss: 0.6600 479/500 [===========================>..] - ETA: 5s - loss: 3.1798 - regression_loss: 2.5201 - classification_loss: 0.6597 480/500 [===========================>..] 
- ETA: 4s - loss: 3.1791 - regression_loss: 2.5197 - classification_loss: 0.6595 481/500 [===========================>..] - ETA: 4s - loss: 3.1785 - regression_loss: 2.5193 - classification_loss: 0.6591 482/500 [===========================>..] - ETA: 4s - loss: 3.1775 - regression_loss: 2.5190 - classification_loss: 0.6585 483/500 [===========================>..] - ETA: 4s - loss: 3.1764 - regression_loss: 2.5185 - classification_loss: 0.6579 484/500 [============================>.] - ETA: 3s - loss: 3.1777 - regression_loss: 2.5198 - classification_loss: 0.6579 485/500 [============================>.] - ETA: 3s - loss: 3.1764 - regression_loss: 2.5190 - classification_loss: 0.6574 486/500 [============================>.] - ETA: 3s - loss: 3.1759 - regression_loss: 2.5186 - classification_loss: 0.6574 487/500 [============================>.] - ETA: 3s - loss: 3.1753 - regression_loss: 2.5183 - classification_loss: 0.6570 488/500 [============================>.] - ETA: 2s - loss: 3.1741 - regression_loss: 2.5175 - classification_loss: 0.6566 489/500 [============================>.] - ETA: 2s - loss: 3.1734 - regression_loss: 2.5171 - classification_loss: 0.6563 490/500 [============================>.] - ETA: 2s - loss: 3.1714 - regression_loss: 2.5156 - classification_loss: 0.6557 491/500 [============================>.] - ETA: 2s - loss: 3.1721 - regression_loss: 2.5162 - classification_loss: 0.6559 492/500 [============================>.] - ETA: 1s - loss: 3.1713 - regression_loss: 2.5158 - classification_loss: 0.6555 493/500 [============================>.] - ETA: 1s - loss: 3.1704 - regression_loss: 2.5154 - classification_loss: 0.6551 494/500 [============================>.] - ETA: 1s - loss: 3.1701 - regression_loss: 2.5151 - classification_loss: 0.6550 495/500 [============================>.] - ETA: 1s - loss: 3.1694 - regression_loss: 2.5150 - classification_loss: 0.6545 496/500 [============================>.] 
500/500 [==============================] - 123s 245ms/step - loss: 3.1637 - regression_loss: 2.5119 - classification_loss: 0.6518
1172 instances of class plum with average precision: 0.1668
mAP: 0.1668
Epoch 00001: saving model to ./training/snapshots/resnet50_pascal_01.h5
Epoch 2/150
[Epoch 2/150, batches 1-43: redundant per-batch progress-bar updates condensed. The running loss started at 1.5928 (regression_loss: 1.3484, classification_loss: 0.2444) at batch 1/500 and settled near 2.6978 (regression_loss: 2.2670, classification_loss: 0.4308) by batch 42/500.]
- ETA: 1:40 - loss: 2.6894 - regression_loss: 2.2613 - classification_loss: 0.4281 44/500 [=>............................] - ETA: 1:40 - loss: 2.6971 - regression_loss: 2.2681 - classification_loss: 0.4291 45/500 [=>............................] - ETA: 1:40 - loss: 2.7013 - regression_loss: 2.2726 - classification_loss: 0.4287 46/500 [=>............................] - ETA: 1:40 - loss: 2.7000 - regression_loss: 2.2717 - classification_loss: 0.4283 47/500 [=>............................] - ETA: 1:40 - loss: 2.6974 - regression_loss: 2.2702 - classification_loss: 0.4272 48/500 [=>............................] - ETA: 1:40 - loss: 2.6988 - regression_loss: 2.2707 - classification_loss: 0.4281 49/500 [=>............................] - ETA: 1:40 - loss: 2.6904 - regression_loss: 2.2648 - classification_loss: 0.4256 50/500 [==>...........................] - ETA: 1:39 - loss: 2.6883 - regression_loss: 2.2621 - classification_loss: 0.4262 51/500 [==>...........................] - ETA: 1:39 - loss: 2.6892 - regression_loss: 2.2632 - classification_loss: 0.4260 52/500 [==>...........................] - ETA: 1:39 - loss: 2.6842 - regression_loss: 2.2600 - classification_loss: 0.4242 53/500 [==>...........................] - ETA: 1:39 - loss: 2.6831 - regression_loss: 2.2585 - classification_loss: 0.4246 54/500 [==>...........................] - ETA: 1:39 - loss: 2.6909 - regression_loss: 2.2660 - classification_loss: 0.4249 55/500 [==>...........................] - ETA: 1:39 - loss: 2.6932 - regression_loss: 2.2678 - classification_loss: 0.4254 56/500 [==>...........................] - ETA: 1:38 - loss: 2.6935 - regression_loss: 2.2664 - classification_loss: 0.4270 57/500 [==>...........................] - ETA: 1:38 - loss: 2.7120 - regression_loss: 2.2756 - classification_loss: 0.4364 58/500 [==>...........................] - ETA: 1:38 - loss: 2.7028 - regression_loss: 2.2681 - classification_loss: 0.4347 59/500 [==>...........................] 
- ETA: 1:38 - loss: 2.7019 - regression_loss: 2.2657 - classification_loss: 0.4362 60/500 [==>...........................] - ETA: 1:38 - loss: 2.7038 - regression_loss: 2.2681 - classification_loss: 0.4357 61/500 [==>...........................] - ETA: 1:37 - loss: 2.7125 - regression_loss: 2.2758 - classification_loss: 0.4367 62/500 [==>...........................] - ETA: 1:37 - loss: 2.7128 - regression_loss: 2.2766 - classification_loss: 0.4362 63/500 [==>...........................] - ETA: 1:37 - loss: 2.7111 - regression_loss: 2.2754 - classification_loss: 0.4356 64/500 [==>...........................] - ETA: 1:37 - loss: 2.7066 - regression_loss: 2.2728 - classification_loss: 0.4338 65/500 [==>...........................] - ETA: 1:37 - loss: 2.7086 - regression_loss: 2.2740 - classification_loss: 0.4345 66/500 [==>...........................] - ETA: 1:37 - loss: 2.7014 - regression_loss: 2.2685 - classification_loss: 0.4329 67/500 [===>..........................] - ETA: 1:36 - loss: 2.7006 - regression_loss: 2.2688 - classification_loss: 0.4317 68/500 [===>..........................] - ETA: 1:36 - loss: 2.7033 - regression_loss: 2.2715 - classification_loss: 0.4318 69/500 [===>..........................] - ETA: 1:36 - loss: 2.7018 - regression_loss: 2.2703 - classification_loss: 0.4314 70/500 [===>..........................] - ETA: 1:36 - loss: 2.6971 - regression_loss: 2.2672 - classification_loss: 0.4298 71/500 [===>..........................] - ETA: 1:35 - loss: 2.6959 - regression_loss: 2.2671 - classification_loss: 0.4288 72/500 [===>..........................] - ETA: 1:35 - loss: 2.6941 - regression_loss: 2.2658 - classification_loss: 0.4282 73/500 [===>..........................] - ETA: 1:35 - loss: 2.6975 - regression_loss: 2.2683 - classification_loss: 0.4293 74/500 [===>..........................] - ETA: 1:35 - loss: 2.6976 - regression_loss: 2.2682 - classification_loss: 0.4294 75/500 [===>..........................] 
- ETA: 1:35 - loss: 2.6984 - regression_loss: 2.2684 - classification_loss: 0.4300 76/500 [===>..........................] - ETA: 1:34 - loss: 2.7026 - regression_loss: 2.2677 - classification_loss: 0.4349 77/500 [===>..........................] - ETA: 1:34 - loss: 2.7015 - regression_loss: 2.2667 - classification_loss: 0.4349 78/500 [===>..........................] - ETA: 1:34 - loss: 2.7041 - regression_loss: 2.2679 - classification_loss: 0.4361 79/500 [===>..........................] - ETA: 1:34 - loss: 2.6999 - regression_loss: 2.2648 - classification_loss: 0.4351 80/500 [===>..........................] - ETA: 1:34 - loss: 2.6996 - regression_loss: 2.2629 - classification_loss: 0.4367 81/500 [===>..........................] - ETA: 1:33 - loss: 2.6895 - regression_loss: 2.2553 - classification_loss: 0.4342 82/500 [===>..........................] - ETA: 1:33 - loss: 2.6877 - regression_loss: 2.2545 - classification_loss: 0.4332 83/500 [===>..........................] - ETA: 1:33 - loss: 2.6878 - regression_loss: 2.2557 - classification_loss: 0.4321 84/500 [====>.........................] - ETA: 1:33 - loss: 2.6868 - regression_loss: 2.2557 - classification_loss: 0.4311 85/500 [====>.........................] - ETA: 1:32 - loss: 2.6821 - regression_loss: 2.2522 - classification_loss: 0.4299 86/500 [====>.........................] - ETA: 1:32 - loss: 2.6819 - regression_loss: 2.2518 - classification_loss: 0.4300 87/500 [====>.........................] - ETA: 1:32 - loss: 2.6826 - regression_loss: 2.2519 - classification_loss: 0.4308 88/500 [====>.........................] - ETA: 1:32 - loss: 2.6847 - regression_loss: 2.2549 - classification_loss: 0.4298 89/500 [====>.........................] - ETA: 1:31 - loss: 2.6927 - regression_loss: 2.2599 - classification_loss: 0.4328 90/500 [====>.........................] - ETA: 1:31 - loss: 2.6939 - regression_loss: 2.2609 - classification_loss: 0.4330 91/500 [====>.........................] 
- ETA: 1:31 - loss: 2.6926 - regression_loss: 2.2599 - classification_loss: 0.4328 92/500 [====>.........................] - ETA: 1:31 - loss: 2.6919 - regression_loss: 2.2590 - classification_loss: 0.4329 93/500 [====>.........................] - ETA: 1:30 - loss: 2.6928 - regression_loss: 2.2605 - classification_loss: 0.4323 94/500 [====>.........................] - ETA: 1:30 - loss: 2.6886 - regression_loss: 2.2574 - classification_loss: 0.4312 95/500 [====>.........................] - ETA: 1:30 - loss: 2.6885 - regression_loss: 2.2566 - classification_loss: 0.4320 96/500 [====>.........................] - ETA: 1:30 - loss: 2.6896 - regression_loss: 2.2579 - classification_loss: 0.4317 97/500 [====>.........................] - ETA: 1:29 - loss: 2.6845 - regression_loss: 2.2545 - classification_loss: 0.4300 98/500 [====>.........................] - ETA: 1:29 - loss: 2.6801 - regression_loss: 2.2507 - classification_loss: 0.4293 99/500 [====>.........................] - ETA: 1:29 - loss: 2.6796 - regression_loss: 2.2495 - classification_loss: 0.4301 100/500 [=====>........................] - ETA: 1:29 - loss: 2.6686 - regression_loss: 2.2403 - classification_loss: 0.4284 101/500 [=====>........................] - ETA: 1:29 - loss: 2.6680 - regression_loss: 2.2397 - classification_loss: 0.4283 102/500 [=====>........................] - ETA: 1:28 - loss: 2.6686 - regression_loss: 2.2405 - classification_loss: 0.4280 103/500 [=====>........................] - ETA: 1:28 - loss: 2.6656 - regression_loss: 2.2380 - classification_loss: 0.4275 104/500 [=====>........................] - ETA: 1:28 - loss: 2.6661 - regression_loss: 2.2388 - classification_loss: 0.4273 105/500 [=====>........................] - ETA: 1:28 - loss: 2.6702 - regression_loss: 2.2428 - classification_loss: 0.4274 106/500 [=====>........................] - ETA: 1:27 - loss: 2.6684 - regression_loss: 2.2421 - classification_loss: 0.4263 107/500 [=====>........................] 
- ETA: 1:27 - loss: 2.6639 - regression_loss: 2.2382 - classification_loss: 0.4257 108/500 [=====>........................] - ETA: 1:27 - loss: 2.6659 - regression_loss: 2.2398 - classification_loss: 0.4262 109/500 [=====>........................] - ETA: 1:27 - loss: 2.6680 - regression_loss: 2.2416 - classification_loss: 0.4264 110/500 [=====>........................] - ETA: 1:27 - loss: 2.6700 - regression_loss: 2.2430 - classification_loss: 0.4270 111/500 [=====>........................] - ETA: 1:26 - loss: 2.6712 - regression_loss: 2.2427 - classification_loss: 0.4285 112/500 [=====>........................] - ETA: 1:26 - loss: 2.6711 - regression_loss: 2.2425 - classification_loss: 0.4287 113/500 [=====>........................] - ETA: 1:26 - loss: 2.6719 - regression_loss: 2.2443 - classification_loss: 0.4276 114/500 [=====>........................] - ETA: 1:26 - loss: 2.6709 - regression_loss: 2.2435 - classification_loss: 0.4275 115/500 [=====>........................] - ETA: 1:25 - loss: 2.6799 - regression_loss: 2.2426 - classification_loss: 0.4373 116/500 [=====>........................] - ETA: 1:25 - loss: 2.6824 - regression_loss: 2.2446 - classification_loss: 0.4378 117/500 [======>.......................] - ETA: 1:25 - loss: 2.6802 - regression_loss: 2.2433 - classification_loss: 0.4368 118/500 [======>.......................] - ETA: 1:25 - loss: 2.6793 - regression_loss: 2.2421 - classification_loss: 0.4372 119/500 [======>.......................] - ETA: 1:24 - loss: 2.6772 - regression_loss: 2.2401 - classification_loss: 0.4371 120/500 [======>.......................] - ETA: 1:24 - loss: 2.6758 - regression_loss: 2.2391 - classification_loss: 0.4367 121/500 [======>.......................] - ETA: 1:24 - loss: 2.6745 - regression_loss: 2.2381 - classification_loss: 0.4364 122/500 [======>.......................] - ETA: 1:24 - loss: 2.6717 - regression_loss: 2.2362 - classification_loss: 0.4354 123/500 [======>.......................] 
- ETA: 1:23 - loss: 2.6708 - regression_loss: 2.2360 - classification_loss: 0.4349 124/500 [======>.......................] - ETA: 1:23 - loss: 2.6670 - regression_loss: 2.2330 - classification_loss: 0.4340 125/500 [======>.......................] - ETA: 1:23 - loss: 2.6663 - regression_loss: 2.2323 - classification_loss: 0.4340 126/500 [======>.......................] - ETA: 1:23 - loss: 2.6666 - regression_loss: 2.2325 - classification_loss: 0.4341 127/500 [======>.......................] - ETA: 1:22 - loss: 2.6645 - regression_loss: 2.2306 - classification_loss: 0.4339 128/500 [======>.......................] - ETA: 1:22 - loss: 2.6684 - regression_loss: 2.2318 - classification_loss: 0.4365 129/500 [======>.......................] - ETA: 1:22 - loss: 2.6691 - regression_loss: 2.2326 - classification_loss: 0.4365 130/500 [======>.......................] - ETA: 1:22 - loss: 2.6683 - regression_loss: 2.2319 - classification_loss: 0.4364 131/500 [======>.......................] - ETA: 1:21 - loss: 2.6681 - regression_loss: 2.2320 - classification_loss: 0.4362 132/500 [======>.......................] - ETA: 1:21 - loss: 2.6670 - regression_loss: 2.2313 - classification_loss: 0.4357 133/500 [======>.......................] - ETA: 1:21 - loss: 2.6662 - regression_loss: 2.2305 - classification_loss: 0.4357 134/500 [=======>......................] - ETA: 1:21 - loss: 2.6650 - regression_loss: 2.2299 - classification_loss: 0.4352 135/500 [=======>......................] - ETA: 1:20 - loss: 2.6631 - regression_loss: 2.2282 - classification_loss: 0.4348 136/500 [=======>......................] - ETA: 1:20 - loss: 2.6608 - regression_loss: 2.2265 - classification_loss: 0.4342 137/500 [=======>......................] - ETA: 1:20 - loss: 2.6601 - regression_loss: 2.2262 - classification_loss: 0.4339 138/500 [=======>......................] - ETA: 1:20 - loss: 2.6559 - regression_loss: 2.2227 - classification_loss: 0.4331 139/500 [=======>......................] 
- ETA: 1:20 - loss: 2.6576 - regression_loss: 2.2242 - classification_loss: 0.4334 140/500 [=======>......................] - ETA: 1:19 - loss: 2.6559 - regression_loss: 2.2227 - classification_loss: 0.4332 141/500 [=======>......................] - ETA: 1:19 - loss: 2.6533 - regression_loss: 2.2208 - classification_loss: 0.4325 142/500 [=======>......................] - ETA: 1:19 - loss: 2.6537 - regression_loss: 2.2215 - classification_loss: 0.4322 143/500 [=======>......................] - ETA: 1:19 - loss: 2.6546 - regression_loss: 2.2226 - classification_loss: 0.4320 144/500 [=======>......................] - ETA: 1:18 - loss: 2.6556 - regression_loss: 2.2235 - classification_loss: 0.4321 145/500 [=======>......................] - ETA: 1:18 - loss: 2.6561 - regression_loss: 2.2244 - classification_loss: 0.4317 146/500 [=======>......................] - ETA: 1:18 - loss: 2.6557 - regression_loss: 2.2243 - classification_loss: 0.4313 147/500 [=======>......................] - ETA: 1:18 - loss: 2.6521 - regression_loss: 2.2214 - classification_loss: 0.4307 148/500 [=======>......................] - ETA: 1:18 - loss: 2.6482 - regression_loss: 2.2184 - classification_loss: 0.4299 149/500 [=======>......................] - ETA: 1:17 - loss: 2.6470 - regression_loss: 2.2166 - classification_loss: 0.4304 150/500 [========>.....................] - ETA: 1:17 - loss: 2.6492 - regression_loss: 2.2188 - classification_loss: 0.4304 151/500 [========>.....................] - ETA: 1:17 - loss: 2.6451 - regression_loss: 2.2156 - classification_loss: 0.4295 152/500 [========>.....................] - ETA: 1:17 - loss: 2.6442 - regression_loss: 2.2151 - classification_loss: 0.4291 153/500 [========>.....................] - ETA: 1:16 - loss: 2.6447 - regression_loss: 2.2159 - classification_loss: 0.4288 154/500 [========>.....................] - ETA: 1:16 - loss: 2.6470 - regression_loss: 2.2185 - classification_loss: 0.4285 155/500 [========>.....................] 
- ETA: 1:16 - loss: 2.6470 - regression_loss: 2.2185 - classification_loss: 0.4284 156/500 [========>.....................] - ETA: 1:16 - loss: 2.6463 - regression_loss: 2.2169 - classification_loss: 0.4294 157/500 [========>.....................] - ETA: 1:16 - loss: 2.6465 - regression_loss: 2.2172 - classification_loss: 0.4293 158/500 [========>.....................] - ETA: 1:15 - loss: 2.6447 - regression_loss: 2.2158 - classification_loss: 0.4289 159/500 [========>.....................] - ETA: 1:15 - loss: 2.6433 - regression_loss: 2.2149 - classification_loss: 0.4284 160/500 [========>.....................] - ETA: 1:15 - loss: 2.6459 - regression_loss: 2.2154 - classification_loss: 0.4305 161/500 [========>.....................] - ETA: 1:15 - loss: 2.6440 - regression_loss: 2.2144 - classification_loss: 0.4296 162/500 [========>.....................] - ETA: 1:15 - loss: 2.6466 - regression_loss: 2.2164 - classification_loss: 0.4302 163/500 [========>.....................] - ETA: 1:14 - loss: 2.6471 - regression_loss: 2.2174 - classification_loss: 0.4297 164/500 [========>.....................] - ETA: 1:14 - loss: 2.6464 - regression_loss: 2.2167 - classification_loss: 0.4297 165/500 [========>.....................] - ETA: 1:14 - loss: 2.6448 - regression_loss: 2.2153 - classification_loss: 0.4295 166/500 [========>.....................] - ETA: 1:14 - loss: 2.6447 - regression_loss: 2.2153 - classification_loss: 0.4294 167/500 [=========>....................] - ETA: 1:13 - loss: 2.6455 - regression_loss: 2.2160 - classification_loss: 0.4296 168/500 [=========>....................] - ETA: 1:13 - loss: 2.6452 - regression_loss: 2.2162 - classification_loss: 0.4290 169/500 [=========>....................] - ETA: 1:13 - loss: 2.6438 - regression_loss: 2.2150 - classification_loss: 0.4288 170/500 [=========>....................] - ETA: 1:13 - loss: 2.6508 - regression_loss: 2.2187 - classification_loss: 0.4321 171/500 [=========>....................] 
- ETA: 1:13 - loss: 2.6501 - regression_loss: 2.2185 - classification_loss: 0.4316 172/500 [=========>....................] - ETA: 1:12 - loss: 2.6493 - regression_loss: 2.2183 - classification_loss: 0.4310 173/500 [=========>....................] - ETA: 1:12 - loss: 2.6544 - regression_loss: 2.2218 - classification_loss: 0.4326 174/500 [=========>....................] - ETA: 1:12 - loss: 2.6543 - regression_loss: 2.2219 - classification_loss: 0.4324 175/500 [=========>....................] - ETA: 1:12 - loss: 2.6538 - regression_loss: 2.2215 - classification_loss: 0.4322 176/500 [=========>....................] - ETA: 1:11 - loss: 2.6545 - regression_loss: 2.2213 - classification_loss: 0.4332 177/500 [=========>....................] - ETA: 1:11 - loss: 2.6555 - regression_loss: 2.2215 - classification_loss: 0.4339 178/500 [=========>....................] - ETA: 1:11 - loss: 2.6539 - regression_loss: 2.2205 - classification_loss: 0.4334 179/500 [=========>....................] - ETA: 1:11 - loss: 2.6538 - regression_loss: 2.2200 - classification_loss: 0.4339 180/500 [=========>....................] - ETA: 1:11 - loss: 2.6529 - regression_loss: 2.2195 - classification_loss: 0.4334 181/500 [=========>....................] - ETA: 1:10 - loss: 2.6547 - regression_loss: 2.2212 - classification_loss: 0.4336 182/500 [=========>....................] - ETA: 1:10 - loss: 2.6579 - regression_loss: 2.2242 - classification_loss: 0.4337 183/500 [=========>....................] - ETA: 1:10 - loss: 2.6589 - regression_loss: 2.2247 - classification_loss: 0.4342 184/500 [==========>...................] - ETA: 1:10 - loss: 2.6578 - regression_loss: 2.2239 - classification_loss: 0.4339 185/500 [==========>...................] - ETA: 1:09 - loss: 2.6605 - regression_loss: 2.2251 - classification_loss: 0.4354 186/500 [==========>...................] - ETA: 1:09 - loss: 2.6597 - regression_loss: 2.2248 - classification_loss: 0.4349 187/500 [==========>...................] 
- ETA: 1:09 - loss: 2.6598 - regression_loss: 2.2249 - classification_loss: 0.4349 188/500 [==========>...................] - ETA: 1:09 - loss: 2.6612 - regression_loss: 2.2260 - classification_loss: 0.4352 189/500 [==========>...................] - ETA: 1:09 - loss: 2.6609 - regression_loss: 2.2259 - classification_loss: 0.4350 190/500 [==========>...................] - ETA: 1:08 - loss: 2.6584 - regression_loss: 2.2245 - classification_loss: 0.4339 191/500 [==========>...................] - ETA: 1:08 - loss: 2.6579 - regression_loss: 2.2244 - classification_loss: 0.4335 192/500 [==========>...................] - ETA: 1:08 - loss: 2.6572 - regression_loss: 2.2238 - classification_loss: 0.4334 193/500 [==========>...................] - ETA: 1:08 - loss: 2.6571 - regression_loss: 2.2236 - classification_loss: 0.4335 194/500 [==========>...................] - ETA: 1:07 - loss: 2.6560 - regression_loss: 2.2225 - classification_loss: 0.4335 195/500 [==========>...................] - ETA: 1:07 - loss: 2.6537 - regression_loss: 2.2204 - classification_loss: 0.4333 196/500 [==========>...................] - ETA: 1:07 - loss: 2.6555 - regression_loss: 2.2226 - classification_loss: 0.4329 197/500 [==========>...................] - ETA: 1:07 - loss: 2.6522 - regression_loss: 2.2201 - classification_loss: 0.4321 198/500 [==========>...................] - ETA: 1:06 - loss: 2.6533 - regression_loss: 2.2198 - classification_loss: 0.4335 199/500 [==========>...................] - ETA: 1:06 - loss: 2.6516 - regression_loss: 2.2182 - classification_loss: 0.4334 200/500 [===========>..................] - ETA: 1:06 - loss: 2.6508 - regression_loss: 2.2176 - classification_loss: 0.4333 201/500 [===========>..................] - ETA: 1:06 - loss: 2.6492 - regression_loss: 2.2167 - classification_loss: 0.4325 202/500 [===========>..................] - ETA: 1:06 - loss: 2.6495 - regression_loss: 2.2170 - classification_loss: 0.4325 203/500 [===========>..................] 
- ETA: 1:05 - loss: 2.6468 - regression_loss: 2.2146 - classification_loss: 0.4322 204/500 [===========>..................] - ETA: 1:05 - loss: 2.6439 - regression_loss: 2.2122 - classification_loss: 0.4317 205/500 [===========>..................] - ETA: 1:05 - loss: 2.6442 - regression_loss: 2.2122 - classification_loss: 0.4320 206/500 [===========>..................] - ETA: 1:05 - loss: 2.6442 - regression_loss: 2.2121 - classification_loss: 0.4321 207/500 [===========>..................] - ETA: 1:04 - loss: 2.6422 - regression_loss: 2.2107 - classification_loss: 0.4315 208/500 [===========>..................] - ETA: 1:04 - loss: 2.6404 - regression_loss: 2.2096 - classification_loss: 0.4308 209/500 [===========>..................] - ETA: 1:04 - loss: 2.6406 - regression_loss: 2.2097 - classification_loss: 0.4309 210/500 [===========>..................] - ETA: 1:04 - loss: 2.6412 - regression_loss: 2.2104 - classification_loss: 0.4309 211/500 [===========>..................] - ETA: 1:03 - loss: 2.6398 - regression_loss: 2.2096 - classification_loss: 0.4302 212/500 [===========>..................] - ETA: 1:03 - loss: 2.6414 - regression_loss: 2.2112 - classification_loss: 0.4302 213/500 [===========>..................] - ETA: 1:03 - loss: 2.6397 - regression_loss: 2.2100 - classification_loss: 0.4298 214/500 [===========>..................] - ETA: 1:03 - loss: 2.6436 - regression_loss: 2.2131 - classification_loss: 0.4305 215/500 [===========>..................] - ETA: 1:03 - loss: 2.6422 - regression_loss: 2.2121 - classification_loss: 0.4300 216/500 [===========>..................] - ETA: 1:02 - loss: 2.6413 - regression_loss: 2.2118 - classification_loss: 0.4295 217/500 [============>.................] - ETA: 1:02 - loss: 2.6398 - regression_loss: 2.2104 - classification_loss: 0.4294 218/500 [============>.................] - ETA: 1:02 - loss: 2.6391 - regression_loss: 2.2099 - classification_loss: 0.4292 219/500 [============>.................] 
- ETA: 1:02 - loss: 2.6340 - regression_loss: 2.2059 - classification_loss: 0.4280 220/500 [============>.................] - ETA: 1:01 - loss: 2.6335 - regression_loss: 2.2058 - classification_loss: 0.4278 221/500 [============>.................] - ETA: 1:01 - loss: 2.6319 - regression_loss: 2.2049 - classification_loss: 0.4269 222/500 [============>.................] - ETA: 1:01 - loss: 2.6323 - regression_loss: 2.2052 - classification_loss: 0.4271 223/500 [============>.................] - ETA: 1:01 - loss: 2.6370 - regression_loss: 2.2092 - classification_loss: 0.4278 224/500 [============>.................] - ETA: 1:01 - loss: 2.6362 - regression_loss: 2.2086 - classification_loss: 0.4277 225/500 [============>.................] - ETA: 1:00 - loss: 2.6359 - regression_loss: 2.2084 - classification_loss: 0.4275 226/500 [============>.................] - ETA: 1:00 - loss: 2.6356 - regression_loss: 2.2083 - classification_loss: 0.4274 227/500 [============>.................] - ETA: 1:00 - loss: 2.6348 - regression_loss: 2.2080 - classification_loss: 0.4269 228/500 [============>.................] - ETA: 1:00 - loss: 2.6359 - regression_loss: 2.2088 - classification_loss: 0.4271 229/500 [============>.................] - ETA: 59s - loss: 2.6365 - regression_loss: 2.2095 - classification_loss: 0.4270  230/500 [============>.................] - ETA: 59s - loss: 2.6357 - regression_loss: 2.2090 - classification_loss: 0.4267 231/500 [============>.................] - ETA: 59s - loss: 2.6345 - regression_loss: 2.2083 - classification_loss: 0.4262 232/500 [============>.................] - ETA: 59s - loss: 2.6316 - regression_loss: 2.2059 - classification_loss: 0.4257 233/500 [============>.................] - ETA: 59s - loss: 2.6312 - regression_loss: 2.2057 - classification_loss: 0.4255 234/500 [=============>................] - ETA: 58s - loss: 2.6278 - regression_loss: 2.2027 - classification_loss: 0.4252 235/500 [=============>................] 
- ETA: 58s - loss: 2.6295 - regression_loss: 2.2041 - classification_loss: 0.4254 236/500 [=============>................] - ETA: 58s - loss: 2.6289 - regression_loss: 2.2037 - classification_loss: 0.4252 237/500 [=============>................] - ETA: 58s - loss: 2.6278 - regression_loss: 2.2028 - classification_loss: 0.4250 238/500 [=============>................] - ETA: 57s - loss: 2.6274 - regression_loss: 2.2027 - classification_loss: 0.4247 239/500 [=============>................] - ETA: 57s - loss: 2.6269 - regression_loss: 2.2021 - classification_loss: 0.4248 240/500 [=============>................] - ETA: 57s - loss: 2.6291 - regression_loss: 2.2036 - classification_loss: 0.4255 241/500 [=============>................] - ETA: 57s - loss: 2.6275 - regression_loss: 2.2022 - classification_loss: 0.4253 242/500 [=============>................] - ETA: 57s - loss: 2.6266 - regression_loss: 2.2016 - classification_loss: 0.4250 243/500 [=============>................] - ETA: 56s - loss: 2.6254 - regression_loss: 2.2009 - classification_loss: 0.4246 244/500 [=============>................] - ETA: 56s - loss: 2.6276 - regression_loss: 2.2026 - classification_loss: 0.4250 245/500 [=============>................] - ETA: 56s - loss: 2.6262 - regression_loss: 2.2012 - classification_loss: 0.4250 246/500 [=============>................] - ETA: 56s - loss: 2.6258 - regression_loss: 2.2009 - classification_loss: 0.4249 247/500 [=============>................] - ETA: 55s - loss: 2.6248 - regression_loss: 2.2002 - classification_loss: 0.4246 248/500 [=============>................] - ETA: 55s - loss: 2.6253 - regression_loss: 2.2011 - classification_loss: 0.4243 249/500 [=============>................] - ETA: 55s - loss: 2.6253 - regression_loss: 2.2013 - classification_loss: 0.4240 250/500 [==============>...............] - ETA: 55s - loss: 2.6247 - regression_loss: 2.2009 - classification_loss: 0.4238 251/500 [==============>...............] 
[per-batch progress output elided: epoch 2, batches 251–491; running loss drifted down from 2.6248 to 2.5543 (regression 2.2009 → 2.1430, classification 0.4239 → 0.4113)]
500/500 [==============================] - 113s 226ms/step - loss: 2.5549 - regression_loss: 2.1435 - classification_loss: 0.4114
1172 instances of class plum with average precision: 0.3792
mAP: 0.3792
Epoch 00002: saving model to ./training/snapshots/resnet50_pascal_02.h5
Epoch 3/150
[per-batch progress output elided: epoch 3, batches 1–5; running loss climbed from 1.8844 to 2.4157 as the running averages warmed up]
6/500 [..............................]
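A note on reading these lines: the progress bar's "loss" is simply the sum of the two RetinaNet head losses reported next to it (in keras-retinanet, a regression head and a classification head). A quick sanity check with the epoch-2 summary figures above:

```python
# Values copied from the end-of-epoch-2 summary line:
#   loss: 2.5549 - regression_loss: 2.1435 - classification_loss: 0.4114
regression_loss = 2.1435
classification_loss = 0.4114

# The reported total is the sum of the two heads (within rounding).
total = regression_loss + classification_loss
assert abs(total - 2.5549) < 1e-6
print(f"total loss = {total:.4f}")  # total loss = 2.5549
```

The same additivity holds for every per-batch line, so either head's contribution to the total can be tracked directly from the log.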
[per-batch progress output elided: epoch 3, batches 6–85; running loss settled in the 2.30–2.42 range, ending at 2.3540 (regression 1.9813, classification 0.3727)]
86/500 [====>.........................]
- ETA: 1:42 - loss: 2.3551 - regression_loss: 1.9823 - classification_loss: 0.3728 87/500 [====>.........................] - ETA: 1:41 - loss: 2.3552 - regression_loss: 1.9826 - classification_loss: 0.3726 88/500 [====>.........................] - ETA: 1:41 - loss: 2.3563 - regression_loss: 1.9828 - classification_loss: 0.3734 89/500 [====>.........................] - ETA: 1:41 - loss: 2.3550 - regression_loss: 1.9829 - classification_loss: 0.3720 90/500 [====>.........................] - ETA: 1:41 - loss: 2.3567 - regression_loss: 1.9843 - classification_loss: 0.3723 91/500 [====>.........................] - ETA: 1:40 - loss: 2.3567 - regression_loss: 1.9845 - classification_loss: 0.3721 92/500 [====>.........................] - ETA: 1:40 - loss: 2.3578 - regression_loss: 1.9858 - classification_loss: 0.3720 93/500 [====>.........................] - ETA: 1:40 - loss: 2.3540 - regression_loss: 1.9833 - classification_loss: 0.3708 94/500 [====>.........................] - ETA: 1:39 - loss: 2.3535 - regression_loss: 1.9832 - classification_loss: 0.3703 95/500 [====>.........................] - ETA: 1:39 - loss: 2.3474 - regression_loss: 1.9783 - classification_loss: 0.3691 96/500 [====>.........................] - ETA: 1:39 - loss: 2.3379 - regression_loss: 1.9707 - classification_loss: 0.3672 97/500 [====>.........................] - ETA: 1:39 - loss: 2.3417 - regression_loss: 1.9747 - classification_loss: 0.3670 98/500 [====>.........................] - ETA: 1:38 - loss: 2.3416 - regression_loss: 1.9739 - classification_loss: 0.3677 99/500 [====>.........................] - ETA: 1:38 - loss: 2.3427 - regression_loss: 1.9740 - classification_loss: 0.3687 100/500 [=====>........................] - ETA: 1:38 - loss: 2.3439 - regression_loss: 1.9745 - classification_loss: 0.3694 101/500 [=====>........................] - ETA: 1:37 - loss: 2.3450 - regression_loss: 1.9756 - classification_loss: 0.3694 102/500 [=====>........................] 
- ETA: 1:37 - loss: 2.3514 - regression_loss: 1.9807 - classification_loss: 0.3707 103/500 [=====>........................] - ETA: 1:37 - loss: 2.3506 - regression_loss: 1.9803 - classification_loss: 0.3703 104/500 [=====>........................] - ETA: 1:37 - loss: 2.3491 - regression_loss: 1.9796 - classification_loss: 0.3695 105/500 [=====>........................] - ETA: 1:37 - loss: 2.3495 - regression_loss: 1.9802 - classification_loss: 0.3693 106/500 [=====>........................] - ETA: 1:36 - loss: 2.3414 - regression_loss: 1.9736 - classification_loss: 0.3678 107/500 [=====>........................] - ETA: 1:36 - loss: 2.3350 - regression_loss: 1.9688 - classification_loss: 0.3662 108/500 [=====>........................] - ETA: 1:36 - loss: 2.3309 - regression_loss: 1.9658 - classification_loss: 0.3651 109/500 [=====>........................] - ETA: 1:36 - loss: 2.3320 - regression_loss: 1.9670 - classification_loss: 0.3650 110/500 [=====>........................] - ETA: 1:35 - loss: 2.3337 - regression_loss: 1.9689 - classification_loss: 0.3648 111/500 [=====>........................] - ETA: 1:35 - loss: 2.3321 - regression_loss: 1.9676 - classification_loss: 0.3645 112/500 [=====>........................] - ETA: 1:35 - loss: 2.3345 - regression_loss: 1.9702 - classification_loss: 0.3643 113/500 [=====>........................] - ETA: 1:34 - loss: 2.3340 - regression_loss: 1.9699 - classification_loss: 0.3640 114/500 [=====>........................] - ETA: 1:34 - loss: 2.3366 - regression_loss: 1.9731 - classification_loss: 0.3636 115/500 [=====>........................] - ETA: 1:34 - loss: 2.3466 - regression_loss: 1.9819 - classification_loss: 0.3647 116/500 [=====>........................] - ETA: 1:33 - loss: 2.3451 - regression_loss: 1.9809 - classification_loss: 0.3643 117/500 [======>.......................] - ETA: 1:33 - loss: 2.3517 - regression_loss: 1.9871 - classification_loss: 0.3646 118/500 [======>.......................] 
- ETA: 1:33 - loss: 2.3535 - regression_loss: 1.9888 - classification_loss: 0.3648 119/500 [======>.......................] - ETA: 1:32 - loss: 2.3578 - regression_loss: 1.9922 - classification_loss: 0.3656 120/500 [======>.......................] - ETA: 1:32 - loss: 2.3557 - regression_loss: 1.9902 - classification_loss: 0.3654 121/500 [======>.......................] - ETA: 1:32 - loss: 2.3543 - regression_loss: 1.9892 - classification_loss: 0.3651 122/500 [======>.......................] - ETA: 1:32 - loss: 2.3566 - regression_loss: 1.9910 - classification_loss: 0.3656 123/500 [======>.......................] - ETA: 1:31 - loss: 2.3522 - regression_loss: 1.9879 - classification_loss: 0.3643 124/500 [======>.......................] - ETA: 1:31 - loss: 2.3529 - regression_loss: 1.9886 - classification_loss: 0.3643 125/500 [======>.......................] - ETA: 1:31 - loss: 2.3542 - regression_loss: 1.9897 - classification_loss: 0.3645 126/500 [======>.......................] - ETA: 1:30 - loss: 2.3579 - regression_loss: 1.9923 - classification_loss: 0.3656 127/500 [======>.......................] - ETA: 1:30 - loss: 2.3567 - regression_loss: 1.9916 - classification_loss: 0.3651 128/500 [======>.......................] - ETA: 1:30 - loss: 2.3599 - regression_loss: 1.9939 - classification_loss: 0.3660 129/500 [======>.......................] - ETA: 1:30 - loss: 2.3635 - regression_loss: 1.9967 - classification_loss: 0.3668 130/500 [======>.......................] - ETA: 1:29 - loss: 2.3570 - regression_loss: 1.9913 - classification_loss: 0.3657 131/500 [======>.......................] - ETA: 1:29 - loss: 2.3592 - regression_loss: 1.9936 - classification_loss: 0.3656 132/500 [======>.......................] - ETA: 1:29 - loss: 2.3608 - regression_loss: 1.9950 - classification_loss: 0.3657 133/500 [======>.......................] - ETA: 1:28 - loss: 2.3616 - regression_loss: 1.9958 - classification_loss: 0.3658 134/500 [=======>......................] 
- ETA: 1:28 - loss: 2.3590 - regression_loss: 1.9937 - classification_loss: 0.3653 135/500 [=======>......................] - ETA: 1:28 - loss: 2.3633 - regression_loss: 1.9952 - classification_loss: 0.3680 136/500 [=======>......................] - ETA: 1:28 - loss: 2.3611 - regression_loss: 1.9933 - classification_loss: 0.3678 137/500 [=======>......................] - ETA: 1:27 - loss: 2.3630 - regression_loss: 1.9955 - classification_loss: 0.3675 138/500 [=======>......................] - ETA: 1:27 - loss: 2.3634 - regression_loss: 1.9960 - classification_loss: 0.3675 139/500 [=======>......................] - ETA: 1:27 - loss: 2.3630 - regression_loss: 1.9956 - classification_loss: 0.3674 140/500 [=======>......................] - ETA: 1:26 - loss: 2.3609 - regression_loss: 1.9941 - classification_loss: 0.3669 141/500 [=======>......................] - ETA: 1:26 - loss: 2.3569 - regression_loss: 1.9902 - classification_loss: 0.3667 142/500 [=======>......................] - ETA: 1:26 - loss: 2.3547 - regression_loss: 1.9881 - classification_loss: 0.3665 143/500 [=======>......................] - ETA: 1:26 - loss: 2.3563 - regression_loss: 1.9897 - classification_loss: 0.3666 144/500 [=======>......................] - ETA: 1:26 - loss: 2.3603 - regression_loss: 1.9917 - classification_loss: 0.3686 145/500 [=======>......................] - ETA: 1:25 - loss: 2.3622 - regression_loss: 1.9928 - classification_loss: 0.3694 146/500 [=======>......................] - ETA: 1:25 - loss: 2.3573 - regression_loss: 1.9888 - classification_loss: 0.3686 147/500 [=======>......................] - ETA: 1:25 - loss: 2.3550 - regression_loss: 1.9867 - classification_loss: 0.3683 148/500 [=======>......................] - ETA: 1:24 - loss: 2.3541 - regression_loss: 1.9862 - classification_loss: 0.3680 149/500 [=======>......................] - ETA: 1:24 - loss: 2.3551 - regression_loss: 1.9869 - classification_loss: 0.3681 150/500 [========>.....................] 
- ETA: 1:24 - loss: 2.3530 - regression_loss: 1.9854 - classification_loss: 0.3676 151/500 [========>.....................] - ETA: 1:24 - loss: 2.3533 - regression_loss: 1.9862 - classification_loss: 0.3671 152/500 [========>.....................] - ETA: 1:24 - loss: 2.3562 - regression_loss: 1.9880 - classification_loss: 0.3682 153/500 [========>.....................] - ETA: 1:23 - loss: 2.3549 - regression_loss: 1.9876 - classification_loss: 0.3673 154/500 [========>.....................] - ETA: 1:23 - loss: 2.3551 - regression_loss: 1.9881 - classification_loss: 0.3670 155/500 [========>.....................] - ETA: 1:23 - loss: 2.3514 - regression_loss: 1.9846 - classification_loss: 0.3667 156/500 [========>.....................] - ETA: 1:23 - loss: 2.3524 - regression_loss: 1.9856 - classification_loss: 0.3668 157/500 [========>.....................] - ETA: 1:22 - loss: 2.3547 - regression_loss: 1.9883 - classification_loss: 0.3664 158/500 [========>.....................] - ETA: 1:22 - loss: 2.3549 - regression_loss: 1.9892 - classification_loss: 0.3657 159/500 [========>.....................] - ETA: 1:22 - loss: 2.3539 - regression_loss: 1.9884 - classification_loss: 0.3654 160/500 [========>.....................] - ETA: 1:22 - loss: 2.3519 - regression_loss: 1.9870 - classification_loss: 0.3649 161/500 [========>.....................] - ETA: 1:21 - loss: 2.3505 - regression_loss: 1.9857 - classification_loss: 0.3647 162/500 [========>.....................] - ETA: 1:21 - loss: 2.3512 - regression_loss: 1.9862 - classification_loss: 0.3649 163/500 [========>.....................] - ETA: 1:21 - loss: 2.3470 - regression_loss: 1.9829 - classification_loss: 0.3641 164/500 [========>.....................] - ETA: 1:20 - loss: 2.3437 - regression_loss: 1.9802 - classification_loss: 0.3635 165/500 [========>.....................] - ETA: 1:20 - loss: 2.3437 - regression_loss: 1.9802 - classification_loss: 0.3635 166/500 [========>.....................] 
- ETA: 1:20 - loss: 2.3441 - regression_loss: 1.9807 - classification_loss: 0.3634 167/500 [=========>....................] - ETA: 1:20 - loss: 2.3430 - regression_loss: 1.9799 - classification_loss: 0.3631 168/500 [=========>....................] - ETA: 1:20 - loss: 2.3434 - regression_loss: 1.9802 - classification_loss: 0.3632 169/500 [=========>....................] - ETA: 1:19 - loss: 2.3433 - regression_loss: 1.9802 - classification_loss: 0.3631 170/500 [=========>....................] - ETA: 1:19 - loss: 2.3431 - regression_loss: 1.9799 - classification_loss: 0.3632 171/500 [=========>....................] - ETA: 1:19 - loss: 2.3418 - regression_loss: 1.9777 - classification_loss: 0.3641 172/500 [=========>....................] - ETA: 1:19 - loss: 2.3446 - regression_loss: 1.9799 - classification_loss: 0.3647 173/500 [=========>....................] - ETA: 1:18 - loss: 2.3444 - regression_loss: 1.9798 - classification_loss: 0.3646 174/500 [=========>....................] - ETA: 1:18 - loss: 2.3456 - regression_loss: 1.9810 - classification_loss: 0.3646 175/500 [=========>....................] - ETA: 1:18 - loss: 2.3458 - regression_loss: 1.9809 - classification_loss: 0.3650 176/500 [=========>....................] - ETA: 1:18 - loss: 2.3457 - regression_loss: 1.9808 - classification_loss: 0.3649 177/500 [=========>....................] - ETA: 1:17 - loss: 2.3458 - regression_loss: 1.9809 - classification_loss: 0.3649 178/500 [=========>....................] - ETA: 1:17 - loss: 2.3447 - regression_loss: 1.9798 - classification_loss: 0.3649 179/500 [=========>....................] - ETA: 1:17 - loss: 2.3446 - regression_loss: 1.9803 - classification_loss: 0.3643 180/500 [=========>....................] - ETA: 1:17 - loss: 2.3430 - regression_loss: 1.9789 - classification_loss: 0.3641 181/500 [=========>....................] - ETA: 1:16 - loss: 2.3439 - regression_loss: 1.9799 - classification_loss: 0.3640 182/500 [=========>....................] 
- ETA: 1:16 - loss: 2.3467 - regression_loss: 1.9820 - classification_loss: 0.3647 183/500 [=========>....................] - ETA: 1:16 - loss: 2.3483 - regression_loss: 1.9830 - classification_loss: 0.3653 184/500 [==========>...................] - ETA: 1:16 - loss: 2.3493 - regression_loss: 1.9837 - classification_loss: 0.3656 185/500 [==========>...................] - ETA: 1:15 - loss: 2.3505 - regression_loss: 1.9848 - classification_loss: 0.3657 186/500 [==========>...................] - ETA: 1:15 - loss: 2.3492 - regression_loss: 1.9838 - classification_loss: 0.3654 187/500 [==========>...................] - ETA: 1:15 - loss: 2.3446 - regression_loss: 1.9797 - classification_loss: 0.3648 188/500 [==========>...................] - ETA: 1:15 - loss: 2.3423 - regression_loss: 1.9777 - classification_loss: 0.3646 189/500 [==========>...................] - ETA: 1:15 - loss: 2.3399 - regression_loss: 1.9756 - classification_loss: 0.3642 190/500 [==========>...................] - ETA: 1:14 - loss: 2.3379 - regression_loss: 1.9741 - classification_loss: 0.3637 191/500 [==========>...................] - ETA: 1:14 - loss: 2.3391 - regression_loss: 1.9752 - classification_loss: 0.3638 192/500 [==========>...................] - ETA: 1:14 - loss: 2.3391 - regression_loss: 1.9751 - classification_loss: 0.3639 193/500 [==========>...................] - ETA: 1:14 - loss: 2.3403 - regression_loss: 1.9767 - classification_loss: 0.3636 194/500 [==========>...................] - ETA: 1:13 - loss: 2.3405 - regression_loss: 1.9771 - classification_loss: 0.3634 195/500 [==========>...................] - ETA: 1:13 - loss: 2.3397 - regression_loss: 1.9765 - classification_loss: 0.3632 196/500 [==========>...................] - ETA: 1:13 - loss: 2.3408 - regression_loss: 1.9782 - classification_loss: 0.3627 197/500 [==========>...................] - ETA: 1:13 - loss: 2.3401 - regression_loss: 1.9778 - classification_loss: 0.3623 198/500 [==========>...................] 
- ETA: 1:12 - loss: 2.3406 - regression_loss: 1.9785 - classification_loss: 0.3621 199/500 [==========>...................] - ETA: 1:12 - loss: 2.3391 - regression_loss: 1.9771 - classification_loss: 0.3620 200/500 [===========>..................] - ETA: 1:12 - loss: 2.3390 - regression_loss: 1.9771 - classification_loss: 0.3618 201/500 [===========>..................] - ETA: 1:12 - loss: 2.3393 - regression_loss: 1.9775 - classification_loss: 0.3617 202/500 [===========>..................] - ETA: 1:12 - loss: 2.3370 - regression_loss: 1.9756 - classification_loss: 0.3614 203/500 [===========>..................] - ETA: 1:11 - loss: 2.3368 - regression_loss: 1.9756 - classification_loss: 0.3612 204/500 [===========>..................] - ETA: 1:11 - loss: 2.3374 - regression_loss: 1.9763 - classification_loss: 0.3611 205/500 [===========>..................] - ETA: 1:11 - loss: 2.3375 - regression_loss: 1.9766 - classification_loss: 0.3609 206/500 [===========>..................] - ETA: 1:11 - loss: 2.3388 - regression_loss: 1.9781 - classification_loss: 0.3608 207/500 [===========>..................] - ETA: 1:10 - loss: 2.3364 - regression_loss: 1.9764 - classification_loss: 0.3600 208/500 [===========>..................] - ETA: 1:10 - loss: 2.3350 - regression_loss: 1.9754 - classification_loss: 0.3595 209/500 [===========>..................] - ETA: 1:10 - loss: 2.3347 - regression_loss: 1.9751 - classification_loss: 0.3595 210/500 [===========>..................] - ETA: 1:10 - loss: 2.3358 - regression_loss: 1.9761 - classification_loss: 0.3597 211/500 [===========>..................] - ETA: 1:10 - loss: 2.3379 - regression_loss: 1.9775 - classification_loss: 0.3604 212/500 [===========>..................] - ETA: 1:09 - loss: 2.3365 - regression_loss: 1.9764 - classification_loss: 0.3601 213/500 [===========>..................] - ETA: 1:09 - loss: 2.3376 - regression_loss: 1.9771 - classification_loss: 0.3605 214/500 [===========>..................] 
- ETA: 1:09 - loss: 2.3380 - regression_loss: 1.9778 - classification_loss: 0.3601 215/500 [===========>..................] - ETA: 1:09 - loss: 2.3380 - regression_loss: 1.9783 - classification_loss: 0.3597 216/500 [===========>..................] - ETA: 1:08 - loss: 2.3384 - regression_loss: 1.9785 - classification_loss: 0.3598 217/500 [============>.................] - ETA: 1:08 - loss: 2.3377 - regression_loss: 1.9779 - classification_loss: 0.3598 218/500 [============>.................] - ETA: 1:08 - loss: 2.3376 - regression_loss: 1.9776 - classification_loss: 0.3600 219/500 [============>.................] - ETA: 1:08 - loss: 2.3384 - regression_loss: 1.9781 - classification_loss: 0.3603 220/500 [============>.................] - ETA: 1:07 - loss: 2.3404 - regression_loss: 1.9793 - classification_loss: 0.3610 221/500 [============>.................] - ETA: 1:07 - loss: 2.3394 - regression_loss: 1.9779 - classification_loss: 0.3614 222/500 [============>.................] - ETA: 1:07 - loss: 2.3379 - regression_loss: 1.9768 - classification_loss: 0.3611 223/500 [============>.................] - ETA: 1:07 - loss: 2.3392 - regression_loss: 1.9779 - classification_loss: 0.3612 224/500 [============>.................] - ETA: 1:07 - loss: 2.3327 - regression_loss: 1.9723 - classification_loss: 0.3605 225/500 [============>.................] - ETA: 1:06 - loss: 2.3341 - regression_loss: 1.9733 - classification_loss: 0.3607 226/500 [============>.................] - ETA: 1:06 - loss: 2.3349 - regression_loss: 1.9739 - classification_loss: 0.3610 227/500 [============>.................] - ETA: 1:06 - loss: 2.3351 - regression_loss: 1.9739 - classification_loss: 0.3612 228/500 [============>.................] - ETA: 1:06 - loss: 2.3366 - regression_loss: 1.9752 - classification_loss: 0.3615 229/500 [============>.................] - ETA: 1:05 - loss: 2.3374 - regression_loss: 1.9754 - classification_loss: 0.3620 230/500 [============>.................] 
- ETA: 1:05 - loss: 2.3373 - regression_loss: 1.9750 - classification_loss: 0.3622 231/500 [============>.................] - ETA: 1:05 - loss: 2.3380 - regression_loss: 1.9756 - classification_loss: 0.3624 232/500 [============>.................] - ETA: 1:05 - loss: 2.3362 - regression_loss: 1.9743 - classification_loss: 0.3620 233/500 [============>.................] - ETA: 1:04 - loss: 2.3348 - regression_loss: 1.9732 - classification_loss: 0.3616 234/500 [=============>................] - ETA: 1:04 - loss: 2.3369 - regression_loss: 1.9749 - classification_loss: 0.3620 235/500 [=============>................] - ETA: 1:04 - loss: 2.3381 - regression_loss: 1.9759 - classification_loss: 0.3622 236/500 [=============>................] - ETA: 1:04 - loss: 2.3373 - regression_loss: 1.9754 - classification_loss: 0.3619 237/500 [=============>................] - ETA: 1:03 - loss: 2.3369 - regression_loss: 1.9751 - classification_loss: 0.3617 238/500 [=============>................] - ETA: 1:03 - loss: 2.3368 - regression_loss: 1.9752 - classification_loss: 0.3615 239/500 [=============>................] - ETA: 1:03 - loss: 2.3376 - regression_loss: 1.9765 - classification_loss: 0.3612 240/500 [=============>................] - ETA: 1:03 - loss: 2.3397 - regression_loss: 1.9779 - classification_loss: 0.3618 241/500 [=============>................] - ETA: 1:02 - loss: 2.3417 - regression_loss: 1.9795 - classification_loss: 0.3621 242/500 [=============>................] - ETA: 1:02 - loss: 2.3417 - regression_loss: 1.9792 - classification_loss: 0.3625 243/500 [=============>................] - ETA: 1:02 - loss: 2.3427 - regression_loss: 1.9799 - classification_loss: 0.3628 244/500 [=============>................] - ETA: 1:02 - loss: 2.3369 - regression_loss: 1.9750 - classification_loss: 0.3619 245/500 [=============>................] - ETA: 1:01 - loss: 2.3384 - regression_loss: 1.9762 - classification_loss: 0.3622 246/500 [=============>................] 
- ETA: 1:01 - loss: 2.3379 - regression_loss: 1.9755 - classification_loss: 0.3624 247/500 [=============>................] - ETA: 1:01 - loss: 2.3407 - regression_loss: 1.9781 - classification_loss: 0.3626 248/500 [=============>................] - ETA: 1:01 - loss: 2.3410 - regression_loss: 1.9783 - classification_loss: 0.3628 249/500 [=============>................] - ETA: 1:00 - loss: 2.3416 - regression_loss: 1.9788 - classification_loss: 0.3628 250/500 [==============>...............] - ETA: 1:00 - loss: 2.3449 - regression_loss: 1.9813 - classification_loss: 0.3636 251/500 [==============>...............] - ETA: 1:00 - loss: 2.3431 - regression_loss: 1.9798 - classification_loss: 0.3633 252/500 [==============>...............] - ETA: 1:00 - loss: 2.3430 - regression_loss: 1.9795 - classification_loss: 0.3634 253/500 [==============>...............] - ETA: 1:00 - loss: 2.3427 - regression_loss: 1.9795 - classification_loss: 0.3632 254/500 [==============>...............] - ETA: 59s - loss: 2.3431 - regression_loss: 1.9796 - classification_loss: 0.3635  255/500 [==============>...............] - ETA: 59s - loss: 2.3409 - regression_loss: 1.9779 - classification_loss: 0.3631 256/500 [==============>...............] - ETA: 59s - loss: 2.3421 - regression_loss: 1.9791 - classification_loss: 0.3631 257/500 [==============>...............] - ETA: 59s - loss: 2.3407 - regression_loss: 1.9780 - classification_loss: 0.3627 258/500 [==============>...............] - ETA: 58s - loss: 2.3421 - regression_loss: 1.9790 - classification_loss: 0.3632 259/500 [==============>...............] - ETA: 58s - loss: 2.3422 - regression_loss: 1.9792 - classification_loss: 0.3630 260/500 [==============>...............] - ETA: 58s - loss: 2.3400 - regression_loss: 1.9773 - classification_loss: 0.3626 261/500 [==============>...............] - ETA: 58s - loss: 2.3379 - regression_loss: 1.9757 - classification_loss: 0.3623 262/500 [==============>...............] 
- ETA: 57s - loss: 2.3377 - regression_loss: 1.9757 - classification_loss: 0.3620 263/500 [==============>...............] - ETA: 57s - loss: 2.3363 - regression_loss: 1.9746 - classification_loss: 0.3617 264/500 [==============>...............] - ETA: 57s - loss: 2.3366 - regression_loss: 1.9750 - classification_loss: 0.3617 265/500 [==============>...............] - ETA: 57s - loss: 2.3371 - regression_loss: 1.9755 - classification_loss: 0.3616 266/500 [==============>...............] - ETA: 57s - loss: 2.3365 - regression_loss: 1.9749 - classification_loss: 0.3616 267/500 [===============>..............] - ETA: 56s - loss: 2.3356 - regression_loss: 1.9744 - classification_loss: 0.3612 268/500 [===============>..............] - ETA: 56s - loss: 2.3336 - regression_loss: 1.9726 - classification_loss: 0.3610 269/500 [===============>..............] - ETA: 56s - loss: 2.3336 - regression_loss: 1.9724 - classification_loss: 0.3611 270/500 [===============>..............] - ETA: 56s - loss: 2.3341 - regression_loss: 1.9728 - classification_loss: 0.3613 271/500 [===============>..............] - ETA: 55s - loss: 2.3340 - regression_loss: 1.9728 - classification_loss: 0.3612 272/500 [===============>..............] - ETA: 55s - loss: 2.3356 - regression_loss: 1.9739 - classification_loss: 0.3617 273/500 [===============>..............] - ETA: 55s - loss: 2.3350 - regression_loss: 1.9733 - classification_loss: 0.3617 274/500 [===============>..............] - ETA: 55s - loss: 2.3348 - regression_loss: 1.9732 - classification_loss: 0.3616 275/500 [===============>..............] - ETA: 54s - loss: 2.3344 - regression_loss: 1.9728 - classification_loss: 0.3616 276/500 [===============>..............] - ETA: 54s - loss: 2.3333 - regression_loss: 1.9720 - classification_loss: 0.3614 277/500 [===============>..............] - ETA: 54s - loss: 2.3331 - regression_loss: 1.9720 - classification_loss: 0.3611 278/500 [===============>..............] 
- ETA: 54s - loss: 2.3330 - regression_loss: 1.9719 - classification_loss: 0.3611 279/500 [===============>..............] - ETA: 53s - loss: 2.3318 - regression_loss: 1.9711 - classification_loss: 0.3607 280/500 [===============>..............] - ETA: 53s - loss: 2.3320 - regression_loss: 1.9714 - classification_loss: 0.3607 281/500 [===============>..............] - ETA: 53s - loss: 2.3323 - regression_loss: 1.9717 - classification_loss: 0.3606 282/500 [===============>..............] - ETA: 53s - loss: 2.3313 - regression_loss: 1.9711 - classification_loss: 0.3602 283/500 [===============>..............] - ETA: 52s - loss: 2.3305 - regression_loss: 1.9706 - classification_loss: 0.3599 284/500 [================>.............] - ETA: 52s - loss: 2.3322 - regression_loss: 1.9717 - classification_loss: 0.3605 285/500 [================>.............] - ETA: 52s - loss: 2.3330 - regression_loss: 1.9726 - classification_loss: 0.3604 286/500 [================>.............] - ETA: 52s - loss: 2.3338 - regression_loss: 1.9735 - classification_loss: 0.3603 287/500 [================>.............] - ETA: 52s - loss: 2.3319 - regression_loss: 1.9720 - classification_loss: 0.3599 288/500 [================>.............] - ETA: 51s - loss: 2.3321 - regression_loss: 1.9722 - classification_loss: 0.3599 289/500 [================>.............] - ETA: 51s - loss: 2.3314 - regression_loss: 1.9717 - classification_loss: 0.3597 290/500 [================>.............] - ETA: 51s - loss: 2.3308 - regression_loss: 1.9713 - classification_loss: 0.3595 291/500 [================>.............] - ETA: 51s - loss: 2.3283 - regression_loss: 1.9694 - classification_loss: 0.3589 292/500 [================>.............] - ETA: 50s - loss: 2.3290 - regression_loss: 1.9698 - classification_loss: 0.3592 293/500 [================>.............] - ETA: 50s - loss: 2.3294 - regression_loss: 1.9703 - classification_loss: 0.3591 294/500 [================>.............] 
500/500 [==============================] - 122s 245ms/step - loss: 2.2946 - regression_loss: 1.9377 - classification_loss: 0.3569
1172 instances of class plum with average precision: 0.4985
mAP: 0.4985
Epoch 00003: saving model to ./training/snapshots/resnet50_pascal_03.h5
Epoch 4/150
  1/500 [..............................]
- ETA: 1:30 - loss: 2.1723 - regression_loss: 1.8435 - classification_loss: 0.3288 130/500 [======>.......................] - ETA: 1:29 - loss: 2.1664 - regression_loss: 1.8382 - classification_loss: 0.3282 131/500 [======>.......................] - ETA: 1:29 - loss: 2.1648 - regression_loss: 1.8363 - classification_loss: 0.3286 132/500 [======>.......................] - ETA: 1:29 - loss: 2.1618 - regression_loss: 1.8342 - classification_loss: 0.3276 133/500 [======>.......................] - ETA: 1:29 - loss: 2.1605 - regression_loss: 1.8332 - classification_loss: 0.3273 134/500 [=======>......................] - ETA: 1:29 - loss: 2.1629 - regression_loss: 1.8354 - classification_loss: 0.3275 135/500 [=======>......................] - ETA: 1:28 - loss: 2.1618 - regression_loss: 1.8344 - classification_loss: 0.3274 136/500 [=======>......................] - ETA: 1:28 - loss: 2.1622 - regression_loss: 1.8342 - classification_loss: 0.3281 137/500 [=======>......................] - ETA: 1:28 - loss: 2.1662 - regression_loss: 1.8369 - classification_loss: 0.3293 138/500 [=======>......................] - ETA: 1:28 - loss: 2.1732 - regression_loss: 1.8429 - classification_loss: 0.3303 139/500 [=======>......................] - ETA: 1:27 - loss: 2.1790 - regression_loss: 1.8471 - classification_loss: 0.3319 140/500 [=======>......................] - ETA: 1:27 - loss: 2.1787 - regression_loss: 1.8467 - classification_loss: 0.3320 141/500 [=======>......................] - ETA: 1:27 - loss: 2.1803 - regression_loss: 1.8478 - classification_loss: 0.3325 142/500 [=======>......................] - ETA: 1:27 - loss: 2.1823 - regression_loss: 1.8494 - classification_loss: 0.3329 143/500 [=======>......................] - ETA: 1:26 - loss: 2.1814 - regression_loss: 1.8482 - classification_loss: 0.3331 144/500 [=======>......................] - ETA: 1:26 - loss: 2.1801 - regression_loss: 1.8474 - classification_loss: 0.3327 145/500 [=======>......................] 
- ETA: 1:26 - loss: 2.1803 - regression_loss: 1.8479 - classification_loss: 0.3324 146/500 [=======>......................] - ETA: 1:26 - loss: 2.1809 - regression_loss: 1.8486 - classification_loss: 0.3323 147/500 [=======>......................] - ETA: 1:25 - loss: 2.1827 - regression_loss: 1.8497 - classification_loss: 0.3330 148/500 [=======>......................] - ETA: 1:25 - loss: 2.1837 - regression_loss: 1.8508 - classification_loss: 0.3329 149/500 [=======>......................] - ETA: 1:25 - loss: 2.1876 - regression_loss: 1.8535 - classification_loss: 0.3341 150/500 [========>.....................] - ETA: 1:25 - loss: 2.1893 - regression_loss: 1.8552 - classification_loss: 0.3342 151/500 [========>.....................] - ETA: 1:24 - loss: 2.1936 - regression_loss: 1.8541 - classification_loss: 0.3394 152/500 [========>.....................] - ETA: 1:24 - loss: 2.1916 - regression_loss: 1.8524 - classification_loss: 0.3392 153/500 [========>.....................] - ETA: 1:24 - loss: 2.1920 - regression_loss: 1.8528 - classification_loss: 0.3392 154/500 [========>.....................] - ETA: 1:24 - loss: 2.1929 - regression_loss: 1.8537 - classification_loss: 0.3392 155/500 [========>.....................] - ETA: 1:23 - loss: 2.1957 - regression_loss: 1.8571 - classification_loss: 0.3386 156/500 [========>.....................] - ETA: 1:23 - loss: 2.1939 - regression_loss: 1.8559 - classification_loss: 0.3381 157/500 [========>.....................] - ETA: 1:23 - loss: 2.1947 - regression_loss: 1.8567 - classification_loss: 0.3380 158/500 [========>.....................] - ETA: 1:23 - loss: 2.1956 - regression_loss: 1.8582 - classification_loss: 0.3374 159/500 [========>.....................] - ETA: 1:22 - loss: 2.1985 - regression_loss: 1.8609 - classification_loss: 0.3376 160/500 [========>.....................] - ETA: 1:22 - loss: 2.1995 - regression_loss: 1.8619 - classification_loss: 0.3377 161/500 [========>.....................] 
- ETA: 1:22 - loss: 2.1973 - regression_loss: 1.8599 - classification_loss: 0.3375 162/500 [========>.....................] - ETA: 1:22 - loss: 2.1955 - regression_loss: 1.8587 - classification_loss: 0.3368 163/500 [========>.....................] - ETA: 1:21 - loss: 2.1966 - regression_loss: 1.8599 - classification_loss: 0.3367 164/500 [========>.....................] - ETA: 1:21 - loss: 2.1935 - regression_loss: 1.8573 - classification_loss: 0.3362 165/500 [========>.....................] - ETA: 1:21 - loss: 2.1897 - regression_loss: 1.8540 - classification_loss: 0.3356 166/500 [========>.....................] - ETA: 1:21 - loss: 2.1917 - regression_loss: 1.8562 - classification_loss: 0.3355 167/500 [=========>....................] - ETA: 1:20 - loss: 2.1916 - regression_loss: 1.8562 - classification_loss: 0.3354 168/500 [=========>....................] - ETA: 1:20 - loss: 2.1928 - regression_loss: 1.8574 - classification_loss: 0.3354 169/500 [=========>....................] - ETA: 1:20 - loss: 2.1881 - regression_loss: 1.8535 - classification_loss: 0.3346 170/500 [=========>....................] - ETA: 1:20 - loss: 2.1873 - regression_loss: 1.8529 - classification_loss: 0.3344 171/500 [=========>....................] - ETA: 1:19 - loss: 2.1888 - regression_loss: 1.8543 - classification_loss: 0.3345 172/500 [=========>....................] - ETA: 1:19 - loss: 2.1827 - regression_loss: 1.8489 - classification_loss: 0.3338 173/500 [=========>....................] - ETA: 1:19 - loss: 2.1812 - regression_loss: 1.8474 - classification_loss: 0.3338 174/500 [=========>....................] - ETA: 1:19 - loss: 2.1815 - regression_loss: 1.8476 - classification_loss: 0.3339 175/500 [=========>....................] - ETA: 1:19 - loss: 2.1809 - regression_loss: 1.8477 - classification_loss: 0.3332 176/500 [=========>....................] - ETA: 1:18 - loss: 2.1809 - regression_loss: 1.8477 - classification_loss: 0.3332 177/500 [=========>....................] 
- ETA: 1:18 - loss: 2.1801 - regression_loss: 1.8472 - classification_loss: 0.3329 178/500 [=========>....................] - ETA: 1:18 - loss: 2.1789 - regression_loss: 1.8463 - classification_loss: 0.3326 179/500 [=========>....................] - ETA: 1:18 - loss: 2.1806 - regression_loss: 1.8478 - classification_loss: 0.3328 180/500 [=========>....................] - ETA: 1:17 - loss: 2.1812 - regression_loss: 1.8484 - classification_loss: 0.3327 181/500 [=========>....................] - ETA: 1:17 - loss: 2.1841 - regression_loss: 1.8503 - classification_loss: 0.3339 182/500 [=========>....................] - ETA: 1:17 - loss: 2.1831 - regression_loss: 1.8495 - classification_loss: 0.3336 183/500 [=========>....................] - ETA: 1:17 - loss: 2.1835 - regression_loss: 1.8500 - classification_loss: 0.3335 184/500 [==========>...................] - ETA: 1:16 - loss: 2.1815 - regression_loss: 1.8485 - classification_loss: 0.3330 185/500 [==========>...................] - ETA: 1:16 - loss: 2.1808 - regression_loss: 1.8481 - classification_loss: 0.3327 186/500 [==========>...................] - ETA: 1:16 - loss: 2.1785 - regression_loss: 1.8464 - classification_loss: 0.3321 187/500 [==========>...................] - ETA: 1:16 - loss: 2.1745 - regression_loss: 1.8430 - classification_loss: 0.3315 188/500 [==========>...................] - ETA: 1:15 - loss: 2.1742 - regression_loss: 1.8426 - classification_loss: 0.3316 189/500 [==========>...................] - ETA: 1:15 - loss: 2.1724 - regression_loss: 1.8413 - classification_loss: 0.3311 190/500 [==========>...................] - ETA: 1:15 - loss: 2.1727 - regression_loss: 1.8416 - classification_loss: 0.3311 191/500 [==========>...................] - ETA: 1:15 - loss: 2.1717 - regression_loss: 1.8404 - classification_loss: 0.3312 192/500 [==========>...................] - ETA: 1:15 - loss: 2.1712 - regression_loss: 1.8403 - classification_loss: 0.3308 193/500 [==========>...................] 
- ETA: 1:14 - loss: 2.1688 - regression_loss: 1.8383 - classification_loss: 0.3304 194/500 [==========>...................] - ETA: 1:14 - loss: 2.1734 - regression_loss: 1.8418 - classification_loss: 0.3316 195/500 [==========>...................] - ETA: 1:14 - loss: 2.1715 - regression_loss: 1.8404 - classification_loss: 0.3311 196/500 [==========>...................] - ETA: 1:14 - loss: 2.1722 - regression_loss: 1.8410 - classification_loss: 0.3313 197/500 [==========>...................] - ETA: 1:13 - loss: 2.1738 - regression_loss: 1.8424 - classification_loss: 0.3315 198/500 [==========>...................] - ETA: 1:13 - loss: 2.1720 - regression_loss: 1.8409 - classification_loss: 0.3311 199/500 [==========>...................] - ETA: 1:13 - loss: 2.1715 - regression_loss: 1.8403 - classification_loss: 0.3312 200/500 [===========>..................] - ETA: 1:13 - loss: 2.1703 - regression_loss: 1.8396 - classification_loss: 0.3307 201/500 [===========>..................] - ETA: 1:12 - loss: 2.1654 - regression_loss: 1.8354 - classification_loss: 0.3300 202/500 [===========>..................] - ETA: 1:12 - loss: 2.1633 - regression_loss: 1.8335 - classification_loss: 0.3298 203/500 [===========>..................] - ETA: 1:12 - loss: 2.1630 - regression_loss: 1.8334 - classification_loss: 0.3296 204/500 [===========>..................] - ETA: 1:12 - loss: 2.1625 - regression_loss: 1.8333 - classification_loss: 0.3293 205/500 [===========>..................] - ETA: 1:11 - loss: 2.1632 - regression_loss: 1.8338 - classification_loss: 0.3294 206/500 [===========>..................] - ETA: 1:11 - loss: 2.1582 - regression_loss: 1.8296 - classification_loss: 0.3286 207/500 [===========>..................] - ETA: 1:11 - loss: 2.1599 - regression_loss: 1.8313 - classification_loss: 0.3286 208/500 [===========>..................] - ETA: 1:11 - loss: 2.1603 - regression_loss: 1.8316 - classification_loss: 0.3287 209/500 [===========>..................] 
- ETA: 1:11 - loss: 2.1612 - regression_loss: 1.8323 - classification_loss: 0.3289 210/500 [===========>..................] - ETA: 1:10 - loss: 2.1610 - regression_loss: 1.8321 - classification_loss: 0.3289 211/500 [===========>..................] - ETA: 1:10 - loss: 2.1605 - regression_loss: 1.8315 - classification_loss: 0.3290 212/500 [===========>..................] - ETA: 1:10 - loss: 2.1629 - regression_loss: 1.8334 - classification_loss: 0.3295 213/500 [===========>..................] - ETA: 1:10 - loss: 2.1631 - regression_loss: 1.8337 - classification_loss: 0.3294 214/500 [===========>..................] - ETA: 1:09 - loss: 2.1635 - regression_loss: 1.8339 - classification_loss: 0.3296 215/500 [===========>..................] - ETA: 1:09 - loss: 2.1590 - regression_loss: 1.8298 - classification_loss: 0.3291 216/500 [===========>..................] - ETA: 1:09 - loss: 2.1593 - regression_loss: 1.8294 - classification_loss: 0.3299 217/500 [============>.................] - ETA: 1:09 - loss: 2.1593 - regression_loss: 1.8293 - classification_loss: 0.3299 218/500 [============>.................] - ETA: 1:08 - loss: 2.1634 - regression_loss: 1.8328 - classification_loss: 0.3307 219/500 [============>.................] - ETA: 1:08 - loss: 2.1633 - regression_loss: 1.8327 - classification_loss: 0.3306 220/500 [============>.................] - ETA: 1:08 - loss: 2.1636 - regression_loss: 1.8334 - classification_loss: 0.3302 221/500 [============>.................] - ETA: 1:08 - loss: 2.1632 - regression_loss: 1.8325 - classification_loss: 0.3308 222/500 [============>.................] - ETA: 1:07 - loss: 2.1642 - regression_loss: 1.8334 - classification_loss: 0.3308 223/500 [============>.................] - ETA: 1:07 - loss: 2.1676 - regression_loss: 1.8357 - classification_loss: 0.3319 224/500 [============>.................] - ETA: 1:07 - loss: 2.1701 - regression_loss: 1.8374 - classification_loss: 0.3328 225/500 [============>.................] 
- ETA: 1:07 - loss: 2.1713 - regression_loss: 1.8377 - classification_loss: 0.3336 226/500 [============>.................] - ETA: 1:07 - loss: 2.1712 - regression_loss: 1.8376 - classification_loss: 0.3336 227/500 [============>.................] - ETA: 1:06 - loss: 2.1713 - regression_loss: 1.8379 - classification_loss: 0.3334 228/500 [============>.................] - ETA: 1:06 - loss: 2.1704 - regression_loss: 1.8374 - classification_loss: 0.3330 229/500 [============>.................] - ETA: 1:06 - loss: 2.1680 - regression_loss: 1.8353 - classification_loss: 0.3327 230/500 [============>.................] - ETA: 1:06 - loss: 2.1701 - regression_loss: 1.8373 - classification_loss: 0.3329 231/500 [============>.................] - ETA: 1:05 - loss: 2.1734 - regression_loss: 1.8396 - classification_loss: 0.3338 232/500 [============>.................] - ETA: 1:05 - loss: 2.1732 - regression_loss: 1.8394 - classification_loss: 0.3338 233/500 [============>.................] - ETA: 1:05 - loss: 2.1715 - regression_loss: 1.8381 - classification_loss: 0.3334 234/500 [=============>................] - ETA: 1:05 - loss: 2.1716 - regression_loss: 1.8382 - classification_loss: 0.3334 235/500 [=============>................] - ETA: 1:04 - loss: 2.1713 - regression_loss: 1.8379 - classification_loss: 0.3334 236/500 [=============>................] - ETA: 1:04 - loss: 2.1727 - regression_loss: 1.8388 - classification_loss: 0.3339 237/500 [=============>................] - ETA: 1:04 - loss: 2.1717 - regression_loss: 1.8385 - classification_loss: 0.3333 238/500 [=============>................] - ETA: 1:04 - loss: 2.1727 - regression_loss: 1.8394 - classification_loss: 0.3333 239/500 [=============>................] - ETA: 1:03 - loss: 2.1731 - regression_loss: 1.8398 - classification_loss: 0.3333 240/500 [=============>................] - ETA: 1:03 - loss: 2.1742 - regression_loss: 1.8403 - classification_loss: 0.3339 241/500 [=============>................] 
- ETA: 1:03 - loss: 2.1743 - regression_loss: 1.8404 - classification_loss: 0.3339 242/500 [=============>................] - ETA: 1:03 - loss: 2.1754 - regression_loss: 1.8413 - classification_loss: 0.3341 243/500 [=============>................] - ETA: 1:02 - loss: 2.1755 - regression_loss: 1.8412 - classification_loss: 0.3342 244/500 [=============>................] - ETA: 1:02 - loss: 2.1769 - regression_loss: 1.8424 - classification_loss: 0.3345 245/500 [=============>................] - ETA: 1:02 - loss: 2.1751 - regression_loss: 1.8413 - classification_loss: 0.3339 246/500 [=============>................] - ETA: 1:02 - loss: 2.1738 - regression_loss: 1.8401 - classification_loss: 0.3337 247/500 [=============>................] - ETA: 1:01 - loss: 2.1739 - regression_loss: 1.8401 - classification_loss: 0.3338 248/500 [=============>................] - ETA: 1:01 - loss: 2.1751 - regression_loss: 1.8412 - classification_loss: 0.3339 249/500 [=============>................] - ETA: 1:01 - loss: 2.1767 - regression_loss: 1.8423 - classification_loss: 0.3344 250/500 [==============>...............] - ETA: 1:01 - loss: 2.1777 - regression_loss: 1.8433 - classification_loss: 0.3344 251/500 [==============>...............] - ETA: 1:00 - loss: 2.1774 - regression_loss: 1.8432 - classification_loss: 0.3342 252/500 [==============>...............] - ETA: 1:00 - loss: 2.1757 - regression_loss: 1.8419 - classification_loss: 0.3338 253/500 [==============>...............] - ETA: 1:00 - loss: 2.1746 - regression_loss: 1.8411 - classification_loss: 0.3335 254/500 [==============>...............] - ETA: 1:00 - loss: 2.1753 - regression_loss: 1.8418 - classification_loss: 0.3335 255/500 [==============>...............] - ETA: 59s - loss: 2.1747 - regression_loss: 1.8413 - classification_loss: 0.3334  256/500 [==============>...............] - ETA: 59s - loss: 2.1760 - regression_loss: 1.8426 - classification_loss: 0.3334 257/500 [==============>...............] 
- ETA: 59s - loss: 2.1767 - regression_loss: 1.8428 - classification_loss: 0.3339 258/500 [==============>...............] - ETA: 59s - loss: 2.1766 - regression_loss: 1.8428 - classification_loss: 0.3338 259/500 [==============>...............] - ETA: 58s - loss: 2.1755 - regression_loss: 1.8420 - classification_loss: 0.3335 260/500 [==============>...............] - ETA: 58s - loss: 2.1763 - regression_loss: 1.8428 - classification_loss: 0.3336 261/500 [==============>...............] - ETA: 58s - loss: 2.1754 - regression_loss: 1.8422 - classification_loss: 0.3332 262/500 [==============>...............] - ETA: 58s - loss: 2.1749 - regression_loss: 1.8416 - classification_loss: 0.3332 263/500 [==============>...............] - ETA: 57s - loss: 2.1734 - regression_loss: 1.8404 - classification_loss: 0.3329 264/500 [==============>...............] - ETA: 57s - loss: 2.1747 - regression_loss: 1.8413 - classification_loss: 0.3334 265/500 [==============>...............] - ETA: 57s - loss: 2.1759 - regression_loss: 1.8425 - classification_loss: 0.3334 266/500 [==============>...............] - ETA: 57s - loss: 2.1767 - regression_loss: 1.8433 - classification_loss: 0.3335 267/500 [===============>..............] - ETA: 56s - loss: 2.1787 - regression_loss: 1.8453 - classification_loss: 0.3334 268/500 [===============>..............] - ETA: 56s - loss: 2.1786 - regression_loss: 1.8452 - classification_loss: 0.3334 269/500 [===============>..............] - ETA: 56s - loss: 2.1729 - regression_loss: 1.8403 - classification_loss: 0.3325 270/500 [===============>..............] - ETA: 56s - loss: 2.1730 - regression_loss: 1.8407 - classification_loss: 0.3323 271/500 [===============>..............] - ETA: 55s - loss: 2.1740 - regression_loss: 1.8414 - classification_loss: 0.3326 272/500 [===============>..............] - ETA: 55s - loss: 2.1738 - regression_loss: 1.8413 - classification_loss: 0.3326 273/500 [===============>..............] 
- ETA: 55s - loss: 2.1754 - regression_loss: 1.8422 - classification_loss: 0.3331 274/500 [===============>..............] - ETA: 55s - loss: 2.1715 - regression_loss: 1.8391 - classification_loss: 0.3324 275/500 [===============>..............] - ETA: 54s - loss: 2.1719 - regression_loss: 1.8395 - classification_loss: 0.3324 276/500 [===============>..............] - ETA: 54s - loss: 2.1733 - regression_loss: 1.8407 - classification_loss: 0.3326 277/500 [===============>..............] - ETA: 54s - loss: 2.1694 - regression_loss: 1.8374 - classification_loss: 0.3320 278/500 [===============>..............] - ETA: 54s - loss: 2.1659 - regression_loss: 1.8344 - classification_loss: 0.3315 279/500 [===============>..............] - ETA: 53s - loss: 2.1641 - regression_loss: 1.8329 - classification_loss: 0.3312 280/500 [===============>..............] - ETA: 53s - loss: 2.1630 - regression_loss: 1.8321 - classification_loss: 0.3309 281/500 [===============>..............] - ETA: 53s - loss: 2.1635 - regression_loss: 1.8326 - classification_loss: 0.3309 282/500 [===============>..............] - ETA: 53s - loss: 2.1637 - regression_loss: 1.8327 - classification_loss: 0.3310 283/500 [===============>..............] - ETA: 52s - loss: 2.1619 - regression_loss: 1.8316 - classification_loss: 0.3303 284/500 [================>.............] - ETA: 52s - loss: 2.1599 - regression_loss: 1.8298 - classification_loss: 0.3301 285/500 [================>.............] - ETA: 52s - loss: 2.1597 - regression_loss: 1.8296 - classification_loss: 0.3301 286/500 [================>.............] - ETA: 52s - loss: 2.1607 - regression_loss: 1.8306 - classification_loss: 0.3301 287/500 [================>.............] - ETA: 52s - loss: 2.1635 - regression_loss: 1.8330 - classification_loss: 0.3306 288/500 [================>.............] - ETA: 51s - loss: 2.1633 - regression_loss: 1.8327 - classification_loss: 0.3305 289/500 [================>.............] 
- ETA: 51s - loss: 2.1633 - regression_loss: 1.8327 - classification_loss: 0.3306 290/500 [================>.............] - ETA: 51s - loss: 2.1624 - regression_loss: 1.8321 - classification_loss: 0.3303 291/500 [================>.............] - ETA: 51s - loss: 2.1629 - regression_loss: 1.8325 - classification_loss: 0.3304 292/500 [================>.............] - ETA: 50s - loss: 2.1616 - regression_loss: 1.8313 - classification_loss: 0.3303 293/500 [================>.............] - ETA: 50s - loss: 2.1603 - regression_loss: 1.8299 - classification_loss: 0.3304 294/500 [================>.............] - ETA: 50s - loss: 2.1619 - regression_loss: 1.8312 - classification_loss: 0.3307 295/500 [================>.............] - ETA: 50s - loss: 2.1636 - regression_loss: 1.8327 - classification_loss: 0.3309 296/500 [================>.............] - ETA: 49s - loss: 2.1642 - regression_loss: 1.8333 - classification_loss: 0.3310 297/500 [================>.............] - ETA: 49s - loss: 2.1649 - regression_loss: 1.8338 - classification_loss: 0.3311 298/500 [================>.............] - ETA: 49s - loss: 2.1653 - regression_loss: 1.8343 - classification_loss: 0.3310 299/500 [================>.............] - ETA: 49s - loss: 2.1652 - regression_loss: 1.8339 - classification_loss: 0.3313 300/500 [=================>............] - ETA: 48s - loss: 2.1641 - regression_loss: 1.8330 - classification_loss: 0.3310 301/500 [=================>............] - ETA: 48s - loss: 2.1615 - regression_loss: 1.8307 - classification_loss: 0.3307 302/500 [=================>............] - ETA: 48s - loss: 2.1614 - regression_loss: 1.8308 - classification_loss: 0.3306 303/500 [=================>............] - ETA: 48s - loss: 2.1595 - regression_loss: 1.8294 - classification_loss: 0.3301 304/500 [=================>............] - ETA: 47s - loss: 2.1603 - regression_loss: 1.8299 - classification_loss: 0.3304 305/500 [=================>............] 
- ETA: 47s - loss: 2.1596 - regression_loss: 1.8294 - classification_loss: 0.3302 306/500 [=================>............] - ETA: 47s - loss: 2.1571 - regression_loss: 1.8273 - classification_loss: 0.3298 307/500 [=================>............] - ETA: 47s - loss: 2.1586 - regression_loss: 1.8288 - classification_loss: 0.3298 308/500 [=================>............] - ETA: 46s - loss: 2.1582 - regression_loss: 1.8284 - classification_loss: 0.3298 309/500 [=================>............] - ETA: 46s - loss: 2.1584 - regression_loss: 1.8278 - classification_loss: 0.3306 310/500 [=================>............] - ETA: 46s - loss: 2.1581 - regression_loss: 1.8275 - classification_loss: 0.3305 311/500 [=================>............] - ETA: 46s - loss: 2.1587 - regression_loss: 1.8281 - classification_loss: 0.3307 312/500 [=================>............] - ETA: 45s - loss: 2.1592 - regression_loss: 1.8284 - classification_loss: 0.3309 313/500 [=================>............] - ETA: 45s - loss: 2.1583 - regression_loss: 1.8270 - classification_loss: 0.3313 314/500 [=================>............] - ETA: 45s - loss: 2.1579 - regression_loss: 1.8266 - classification_loss: 0.3313 315/500 [=================>............] - ETA: 45s - loss: 2.1575 - regression_loss: 1.8264 - classification_loss: 0.3311 316/500 [=================>............] - ETA: 44s - loss: 2.1571 - regression_loss: 1.8257 - classification_loss: 0.3314 317/500 [==================>...........] - ETA: 44s - loss: 2.1569 - regression_loss: 1.8258 - classification_loss: 0.3312 318/500 [==================>...........] - ETA: 44s - loss: 2.1563 - regression_loss: 1.8250 - classification_loss: 0.3313 319/500 [==================>...........] - ETA: 44s - loss: 2.1549 - regression_loss: 1.8238 - classification_loss: 0.3311 320/500 [==================>...........] - ETA: 43s - loss: 2.1550 - regression_loss: 1.8239 - classification_loss: 0.3311 321/500 [==================>...........] 
- ETA: 43s - loss: 2.1546 - regression_loss: 1.8235 - classification_loss: 0.3311 322/500 [==================>...........] - ETA: 43s - loss: 2.1571 - regression_loss: 1.8257 - classification_loss: 0.3314 323/500 [==================>...........] - ETA: 43s - loss: 2.1579 - regression_loss: 1.8264 - classification_loss: 0.3316 324/500 [==================>...........] - ETA: 42s - loss: 2.1585 - regression_loss: 1.8268 - classification_loss: 0.3316 325/500 [==================>...........] - ETA: 42s - loss: 2.1601 - regression_loss: 1.8278 - classification_loss: 0.3323 326/500 [==================>...........] - ETA: 42s - loss: 2.1608 - regression_loss: 1.8285 - classification_loss: 0.3323 327/500 [==================>...........] - ETA: 42s - loss: 2.1611 - regression_loss: 1.8288 - classification_loss: 0.3323 328/500 [==================>...........] - ETA: 41s - loss: 2.1595 - regression_loss: 1.8276 - classification_loss: 0.3319 329/500 [==================>...........] - ETA: 41s - loss: 2.1589 - regression_loss: 1.8270 - classification_loss: 0.3319 330/500 [==================>...........] - ETA: 41s - loss: 2.1578 - regression_loss: 1.8261 - classification_loss: 0.3317 331/500 [==================>...........] - ETA: 41s - loss: 2.1576 - regression_loss: 1.8260 - classification_loss: 0.3315 332/500 [==================>...........] - ETA: 40s - loss: 2.1561 - regression_loss: 1.8248 - classification_loss: 0.3313 333/500 [==================>...........] - ETA: 40s - loss: 2.1569 - regression_loss: 1.8256 - classification_loss: 0.3313 334/500 [===================>..........] - ETA: 40s - loss: 2.1567 - regression_loss: 1.8254 - classification_loss: 0.3313 335/500 [===================>..........] - ETA: 40s - loss: 2.1558 - regression_loss: 1.8247 - classification_loss: 0.3311 336/500 [===================>..........] - ETA: 40s - loss: 2.1565 - regression_loss: 1.8255 - classification_loss: 0.3311 337/500 [===================>..........] 
[per-step progress-bar updates for epoch 4 omitted]
500/500 [==============================] - 119s 239ms/step - loss: 2.1362 - regression_loss: 1.8097 - classification_loss: 0.3265
1172 instances of class plum with average precision: 0.4911
mAP: 0.4911
Epoch 00004: saving model to ./training/snapshots/resnet50_pascal_04.h5
Epoch 5/150
[per-step progress-bar updates for epoch 5 omitted; this excerpt ends mid-epoch at step 172/500]
- ETA: 1:13 - loss: 2.1038 - regression_loss: 1.7838 - classification_loss: 0.3201 173/500 [=========>....................] - ETA: 1:12 - loss: 2.1063 - regression_loss: 1.7861 - classification_loss: 0.3202 174/500 [=========>....................] - ETA: 1:12 - loss: 2.1063 - regression_loss: 1.7862 - classification_loss: 0.3201 175/500 [=========>....................] - ETA: 1:12 - loss: 2.1064 - regression_loss: 1.7863 - classification_loss: 0.3201 176/500 [=========>....................] - ETA: 1:12 - loss: 2.1070 - regression_loss: 1.7869 - classification_loss: 0.3201 177/500 [=========>....................] - ETA: 1:12 - loss: 2.1046 - regression_loss: 1.7849 - classification_loss: 0.3197 178/500 [=========>....................] - ETA: 1:12 - loss: 2.1028 - regression_loss: 1.7835 - classification_loss: 0.3193 179/500 [=========>....................] - ETA: 1:11 - loss: 2.1052 - regression_loss: 1.7866 - classification_loss: 0.3186 180/500 [=========>....................] - ETA: 1:11 - loss: 2.0994 - regression_loss: 1.7818 - classification_loss: 0.3176 181/500 [=========>....................] - ETA: 1:11 - loss: 2.0990 - regression_loss: 1.7818 - classification_loss: 0.3173 182/500 [=========>....................] - ETA: 1:11 - loss: 2.0990 - regression_loss: 1.7818 - classification_loss: 0.3172 183/500 [=========>....................] - ETA: 1:11 - loss: 2.0964 - regression_loss: 1.7796 - classification_loss: 0.3168 184/500 [==========>...................] - ETA: 1:10 - loss: 2.0959 - regression_loss: 1.7794 - classification_loss: 0.3166 185/500 [==========>...................] - ETA: 1:10 - loss: 2.0966 - regression_loss: 1.7800 - classification_loss: 0.3166 186/500 [==========>...................] - ETA: 1:10 - loss: 2.0978 - regression_loss: 1.7808 - classification_loss: 0.3169 187/500 [==========>...................] - ETA: 1:10 - loss: 2.0970 - regression_loss: 1.7802 - classification_loss: 0.3167 188/500 [==========>...................] 
- ETA: 1:10 - loss: 2.0965 - regression_loss: 1.7796 - classification_loss: 0.3169 189/500 [==========>...................] - ETA: 1:09 - loss: 2.0943 - regression_loss: 1.7778 - classification_loss: 0.3164 190/500 [==========>...................] - ETA: 1:09 - loss: 2.0929 - regression_loss: 1.7765 - classification_loss: 0.3164 191/500 [==========>...................] - ETA: 1:09 - loss: 2.0925 - regression_loss: 1.7763 - classification_loss: 0.3162 192/500 [==========>...................] - ETA: 1:09 - loss: 2.0917 - regression_loss: 1.7757 - classification_loss: 0.3160 193/500 [==========>...................] - ETA: 1:09 - loss: 2.0921 - regression_loss: 1.7759 - classification_loss: 0.3162 194/500 [==========>...................] - ETA: 1:08 - loss: 2.0930 - regression_loss: 1.7765 - classification_loss: 0.3165 195/500 [==========>...................] - ETA: 1:08 - loss: 2.0919 - regression_loss: 1.7757 - classification_loss: 0.3162 196/500 [==========>...................] - ETA: 1:08 - loss: 2.0942 - regression_loss: 1.7777 - classification_loss: 0.3165 197/500 [==========>...................] - ETA: 1:08 - loss: 2.0917 - regression_loss: 1.7757 - classification_loss: 0.3160 198/500 [==========>...................] - ETA: 1:08 - loss: 2.0886 - regression_loss: 1.7730 - classification_loss: 0.3156 199/500 [==========>...................] - ETA: 1:07 - loss: 2.0880 - regression_loss: 1.7726 - classification_loss: 0.3154 200/500 [===========>..................] - ETA: 1:07 - loss: 2.0857 - regression_loss: 1.7709 - classification_loss: 0.3148 201/500 [===========>..................] - ETA: 1:07 - loss: 2.0883 - regression_loss: 1.7728 - classification_loss: 0.3155 202/500 [===========>..................] - ETA: 1:07 - loss: 2.0936 - regression_loss: 1.7774 - classification_loss: 0.3163 203/500 [===========>..................] - ETA: 1:07 - loss: 2.0960 - regression_loss: 1.7799 - classification_loss: 0.3161 204/500 [===========>..................] 
- ETA: 1:06 - loss: 2.0919 - regression_loss: 1.7765 - classification_loss: 0.3154 205/500 [===========>..................] - ETA: 1:06 - loss: 2.0920 - regression_loss: 1.7767 - classification_loss: 0.3153 206/500 [===========>..................] - ETA: 1:06 - loss: 2.0941 - regression_loss: 1.7780 - classification_loss: 0.3161 207/500 [===========>..................] - ETA: 1:06 - loss: 2.0952 - regression_loss: 1.7790 - classification_loss: 0.3163 208/500 [===========>..................] - ETA: 1:06 - loss: 2.0951 - regression_loss: 1.7789 - classification_loss: 0.3161 209/500 [===========>..................] - ETA: 1:05 - loss: 2.0886 - regression_loss: 1.7734 - classification_loss: 0.3152 210/500 [===========>..................] - ETA: 1:05 - loss: 2.0840 - regression_loss: 1.7695 - classification_loss: 0.3145 211/500 [===========>..................] - ETA: 1:05 - loss: 2.0849 - regression_loss: 1.7704 - classification_loss: 0.3145 212/500 [===========>..................] - ETA: 1:05 - loss: 2.0830 - regression_loss: 1.7689 - classification_loss: 0.3141 213/500 [===========>..................] - ETA: 1:05 - loss: 2.0785 - regression_loss: 1.7650 - classification_loss: 0.3135 214/500 [===========>..................] - ETA: 1:05 - loss: 2.0759 - regression_loss: 1.7629 - classification_loss: 0.3130 215/500 [===========>..................] - ETA: 1:04 - loss: 2.0744 - regression_loss: 1.7614 - classification_loss: 0.3130 216/500 [===========>..................] - ETA: 1:04 - loss: 2.0785 - regression_loss: 1.7650 - classification_loss: 0.3134 217/500 [============>.................] - ETA: 1:04 - loss: 2.0780 - regression_loss: 1.7649 - classification_loss: 0.3131 218/500 [============>.................] - ETA: 1:04 - loss: 2.0794 - regression_loss: 1.7664 - classification_loss: 0.3130 219/500 [============>.................] - ETA: 1:04 - loss: 2.0777 - regression_loss: 1.7651 - classification_loss: 0.3126 220/500 [============>.................] 
- ETA: 1:03 - loss: 2.0775 - regression_loss: 1.7650 - classification_loss: 0.3125 221/500 [============>.................] - ETA: 1:03 - loss: 2.0785 - regression_loss: 1.7659 - classification_loss: 0.3126 222/500 [============>.................] - ETA: 1:03 - loss: 2.0788 - regression_loss: 1.7663 - classification_loss: 0.3125 223/500 [============>.................] - ETA: 1:03 - loss: 2.0799 - regression_loss: 1.7677 - classification_loss: 0.3123 224/500 [============>.................] - ETA: 1:03 - loss: 2.0803 - regression_loss: 1.7679 - classification_loss: 0.3124 225/500 [============>.................] - ETA: 1:02 - loss: 2.0816 - regression_loss: 1.7694 - classification_loss: 0.3123 226/500 [============>.................] - ETA: 1:02 - loss: 2.0839 - regression_loss: 1.7714 - classification_loss: 0.3125 227/500 [============>.................] - ETA: 1:02 - loss: 2.0853 - regression_loss: 1.7727 - classification_loss: 0.3126 228/500 [============>.................] - ETA: 1:02 - loss: 2.0878 - regression_loss: 1.7750 - classification_loss: 0.3128 229/500 [============>.................] - ETA: 1:02 - loss: 2.0918 - regression_loss: 1.7780 - classification_loss: 0.3138 230/500 [============>.................] - ETA: 1:01 - loss: 2.0884 - regression_loss: 1.7749 - classification_loss: 0.3135 231/500 [============>.................] - ETA: 1:01 - loss: 2.0905 - regression_loss: 1.7767 - classification_loss: 0.3137 232/500 [============>.................] - ETA: 1:01 - loss: 2.0908 - regression_loss: 1.7771 - classification_loss: 0.3138 233/500 [============>.................] - ETA: 1:01 - loss: 2.0919 - regression_loss: 1.7779 - classification_loss: 0.3139 234/500 [=============>................] - ETA: 1:01 - loss: 2.0916 - regression_loss: 1.7778 - classification_loss: 0.3137 235/500 [=============>................] - ETA: 1:00 - loss: 2.0918 - regression_loss: 1.7781 - classification_loss: 0.3138 236/500 [=============>................] 
- ETA: 1:00 - loss: 2.0931 - regression_loss: 1.7792 - classification_loss: 0.3139 237/500 [=============>................] - ETA: 1:00 - loss: 2.0945 - regression_loss: 1.7800 - classification_loss: 0.3144 238/500 [=============>................] - ETA: 1:00 - loss: 2.0949 - regression_loss: 1.7805 - classification_loss: 0.3143 239/500 [=============>................] - ETA: 1:00 - loss: 2.0934 - regression_loss: 1.7791 - classification_loss: 0.3143 240/500 [=============>................] - ETA: 59s - loss: 2.0933 - regression_loss: 1.7790 - classification_loss: 0.3143  241/500 [=============>................] - ETA: 59s - loss: 2.0947 - regression_loss: 1.7801 - classification_loss: 0.3146 242/500 [=============>................] - ETA: 59s - loss: 2.0947 - regression_loss: 1.7803 - classification_loss: 0.3144 243/500 [=============>................] - ETA: 59s - loss: 2.0917 - regression_loss: 1.7779 - classification_loss: 0.3138 244/500 [=============>................] - ETA: 59s - loss: 2.0928 - regression_loss: 1.7787 - classification_loss: 0.3141 245/500 [=============>................] - ETA: 58s - loss: 2.0942 - regression_loss: 1.7799 - classification_loss: 0.3143 246/500 [=============>................] - ETA: 58s - loss: 2.0956 - regression_loss: 1.7811 - classification_loss: 0.3144 247/500 [=============>................] - ETA: 58s - loss: 2.0958 - regression_loss: 1.7816 - classification_loss: 0.3143 248/500 [=============>................] - ETA: 58s - loss: 2.0925 - regression_loss: 1.7788 - classification_loss: 0.3137 249/500 [=============>................] - ETA: 58s - loss: 2.0929 - regression_loss: 1.7791 - classification_loss: 0.3138 250/500 [==============>...............] - ETA: 57s - loss: 2.0925 - regression_loss: 1.7788 - classification_loss: 0.3137 251/500 [==============>...............] - ETA: 57s - loss: 2.0911 - regression_loss: 1.7775 - classification_loss: 0.3136 252/500 [==============>...............] 
- ETA: 57s - loss: 2.0894 - regression_loss: 1.7763 - classification_loss: 0.3132 253/500 [==============>...............] - ETA: 57s - loss: 2.0881 - regression_loss: 1.7750 - classification_loss: 0.3132 254/500 [==============>...............] - ETA: 57s - loss: 2.0870 - regression_loss: 1.7741 - classification_loss: 0.3128 255/500 [==============>...............] - ETA: 56s - loss: 2.0855 - regression_loss: 1.7729 - classification_loss: 0.3126 256/500 [==============>...............] - ETA: 56s - loss: 2.0884 - regression_loss: 1.7754 - classification_loss: 0.3130 257/500 [==============>...............] - ETA: 56s - loss: 2.0883 - regression_loss: 1.7750 - classification_loss: 0.3133 258/500 [==============>...............] - ETA: 56s - loss: 2.0875 - regression_loss: 1.7744 - classification_loss: 0.3130 259/500 [==============>...............] - ETA: 55s - loss: 2.0859 - regression_loss: 1.7731 - classification_loss: 0.3128 260/500 [==============>...............] - ETA: 55s - loss: 2.0869 - regression_loss: 1.7742 - classification_loss: 0.3128 261/500 [==============>...............] - ETA: 55s - loss: 2.0869 - regression_loss: 1.7742 - classification_loss: 0.3127 262/500 [==============>...............] - ETA: 55s - loss: 2.0877 - regression_loss: 1.7747 - classification_loss: 0.3131 263/500 [==============>...............] - ETA: 55s - loss: 2.0898 - regression_loss: 1.7761 - classification_loss: 0.3137 264/500 [==============>...............] - ETA: 54s - loss: 2.0900 - regression_loss: 1.7763 - classification_loss: 0.3137 265/500 [==============>...............] - ETA: 54s - loss: 2.0901 - regression_loss: 1.7761 - classification_loss: 0.3140 266/500 [==============>...............] - ETA: 54s - loss: 2.0913 - regression_loss: 1.7773 - classification_loss: 0.3140 267/500 [===============>..............] - ETA: 54s - loss: 2.0925 - regression_loss: 1.7784 - classification_loss: 0.3141 268/500 [===============>..............] 
- ETA: 54s - loss: 2.0926 - regression_loss: 1.7784 - classification_loss: 0.3142 269/500 [===============>..............] - ETA: 53s - loss: 2.0910 - regression_loss: 1.7772 - classification_loss: 0.3138 270/500 [===============>..............] - ETA: 53s - loss: 2.0902 - regression_loss: 1.7766 - classification_loss: 0.3136 271/500 [===============>..............] - ETA: 53s - loss: 2.0906 - regression_loss: 1.7769 - classification_loss: 0.3137 272/500 [===============>..............] - ETA: 53s - loss: 2.0905 - regression_loss: 1.7767 - classification_loss: 0.3137 273/500 [===============>..............] - ETA: 52s - loss: 2.0878 - regression_loss: 1.7745 - classification_loss: 0.3134 274/500 [===============>..............] - ETA: 52s - loss: 2.0864 - regression_loss: 1.7732 - classification_loss: 0.3131 275/500 [===============>..............] - ETA: 52s - loss: 2.0814 - regression_loss: 1.7690 - classification_loss: 0.3125 276/500 [===============>..............] - ETA: 52s - loss: 2.0817 - regression_loss: 1.7693 - classification_loss: 0.3123 277/500 [===============>..............] - ETA: 52s - loss: 2.0818 - regression_loss: 1.7697 - classification_loss: 0.3121 278/500 [===============>..............] - ETA: 51s - loss: 2.0802 - regression_loss: 1.7684 - classification_loss: 0.3118 279/500 [===============>..............] - ETA: 51s - loss: 2.0791 - regression_loss: 1.7678 - classification_loss: 0.3113 280/500 [===============>..............] - ETA: 51s - loss: 2.0789 - regression_loss: 1.7676 - classification_loss: 0.3113 281/500 [===============>..............] - ETA: 51s - loss: 2.0784 - regression_loss: 1.7671 - classification_loss: 0.3112 282/500 [===============>..............] - ETA: 50s - loss: 2.0782 - regression_loss: 1.7670 - classification_loss: 0.3112 283/500 [===============>..............] - ETA: 50s - loss: 2.0787 - regression_loss: 1.7674 - classification_loss: 0.3113 284/500 [================>.............] 
- ETA: 50s - loss: 2.0785 - regression_loss: 1.7670 - classification_loss: 0.3115 285/500 [================>.............] - ETA: 50s - loss: 2.0796 - regression_loss: 1.7681 - classification_loss: 0.3116 286/500 [================>.............] - ETA: 50s - loss: 2.0797 - regression_loss: 1.7681 - classification_loss: 0.3116 287/500 [================>.............] - ETA: 49s - loss: 2.0794 - regression_loss: 1.7677 - classification_loss: 0.3117 288/500 [================>.............] - ETA: 49s - loss: 2.0792 - regression_loss: 1.7678 - classification_loss: 0.3114 289/500 [================>.............] - ETA: 49s - loss: 2.0791 - regression_loss: 1.7677 - classification_loss: 0.3113 290/500 [================>.............] - ETA: 49s - loss: 2.0784 - regression_loss: 1.7671 - classification_loss: 0.3113 291/500 [================>.............] - ETA: 48s - loss: 2.0802 - regression_loss: 1.7689 - classification_loss: 0.3113 292/500 [================>.............] - ETA: 48s - loss: 2.0803 - regression_loss: 1.7691 - classification_loss: 0.3112 293/500 [================>.............] - ETA: 48s - loss: 2.0800 - regression_loss: 1.7690 - classification_loss: 0.3109 294/500 [================>.............] - ETA: 48s - loss: 2.0793 - regression_loss: 1.7685 - classification_loss: 0.3108 295/500 [================>.............] - ETA: 48s - loss: 2.0783 - regression_loss: 1.7676 - classification_loss: 0.3107 296/500 [================>.............] - ETA: 47s - loss: 2.0764 - regression_loss: 1.7659 - classification_loss: 0.3104 297/500 [================>.............] - ETA: 47s - loss: 2.0771 - regression_loss: 1.7666 - classification_loss: 0.3105 298/500 [================>.............] - ETA: 47s - loss: 2.0759 - regression_loss: 1.7657 - classification_loss: 0.3102 299/500 [================>.............] - ETA: 47s - loss: 2.0751 - regression_loss: 1.7650 - classification_loss: 0.3101 300/500 [=================>............] 
- ETA: 46s - loss: 2.0758 - regression_loss: 1.7656 - classification_loss: 0.3102 301/500 [=================>............] - ETA: 46s - loss: 2.0754 - regression_loss: 1.7654 - classification_loss: 0.3100 302/500 [=================>............] - ETA: 46s - loss: 2.0734 - regression_loss: 1.7637 - classification_loss: 0.3096 303/500 [=================>............] - ETA: 46s - loss: 2.0755 - regression_loss: 1.7656 - classification_loss: 0.3099 304/500 [=================>............] - ETA: 46s - loss: 2.0749 - regression_loss: 1.7652 - classification_loss: 0.3097 305/500 [=================>............] - ETA: 45s - loss: 2.0768 - regression_loss: 1.7668 - classification_loss: 0.3100 306/500 [=================>............] - ETA: 45s - loss: 2.0747 - regression_loss: 1.7651 - classification_loss: 0.3096 307/500 [=================>............] - ETA: 45s - loss: 2.0757 - regression_loss: 1.7661 - classification_loss: 0.3096 308/500 [=================>............] - ETA: 45s - loss: 2.0748 - regression_loss: 1.7655 - classification_loss: 0.3093 309/500 [=================>............] - ETA: 44s - loss: 2.0736 - regression_loss: 1.7646 - classification_loss: 0.3090 310/500 [=================>............] - ETA: 44s - loss: 2.0717 - regression_loss: 1.7628 - classification_loss: 0.3089 311/500 [=================>............] - ETA: 44s - loss: 2.0697 - regression_loss: 1.7612 - classification_loss: 0.3085 312/500 [=================>............] - ETA: 44s - loss: 2.0684 - regression_loss: 1.7602 - classification_loss: 0.3082 313/500 [=================>............] - ETA: 44s - loss: 2.0677 - regression_loss: 1.7598 - classification_loss: 0.3079 314/500 [=================>............] - ETA: 43s - loss: 2.0683 - regression_loss: 1.7602 - classification_loss: 0.3081 315/500 [=================>............] - ETA: 43s - loss: 2.0670 - regression_loss: 1.7591 - classification_loss: 0.3080 316/500 [=================>............] 
- ETA: 43s - loss: 2.0671 - regression_loss: 1.7594 - classification_loss: 0.3077 317/500 [==================>...........] - ETA: 43s - loss: 2.0663 - regression_loss: 1.7588 - classification_loss: 0.3075 318/500 [==================>...........] - ETA: 42s - loss: 2.0654 - regression_loss: 1.7581 - classification_loss: 0.3073 319/500 [==================>...........] - ETA: 42s - loss: 2.0672 - regression_loss: 1.7596 - classification_loss: 0.3076 320/500 [==================>...........] - ETA: 42s - loss: 2.0673 - regression_loss: 1.7597 - classification_loss: 0.3076 321/500 [==================>...........] - ETA: 42s - loss: 2.0673 - regression_loss: 1.7598 - classification_loss: 0.3075 322/500 [==================>...........] - ETA: 42s - loss: 2.0681 - regression_loss: 1.7605 - classification_loss: 0.3076 323/500 [==================>...........] - ETA: 41s - loss: 2.0677 - regression_loss: 1.7602 - classification_loss: 0.3075 324/500 [==================>...........] - ETA: 41s - loss: 2.0682 - regression_loss: 1.7605 - classification_loss: 0.3077 325/500 [==================>...........] - ETA: 41s - loss: 2.0648 - regression_loss: 1.7576 - classification_loss: 0.3072 326/500 [==================>...........] - ETA: 41s - loss: 2.0644 - regression_loss: 1.7571 - classification_loss: 0.3073 327/500 [==================>...........] - ETA: 40s - loss: 2.0655 - regression_loss: 1.7582 - classification_loss: 0.3073 328/500 [==================>...........] - ETA: 40s - loss: 2.0644 - regression_loss: 1.7573 - classification_loss: 0.3070 329/500 [==================>...........] - ETA: 40s - loss: 2.0645 - regression_loss: 1.7574 - classification_loss: 0.3071 330/500 [==================>...........] - ETA: 40s - loss: 2.0658 - regression_loss: 1.7586 - classification_loss: 0.3072 331/500 [==================>...........] - ETA: 39s - loss: 2.0631 - regression_loss: 1.7563 - classification_loss: 0.3068 332/500 [==================>...........] 
- ETA: 39s - loss: 2.0632 - regression_loss: 1.7565 - classification_loss: 0.3067 333/500 [==================>...........] - ETA: 39s - loss: 2.0655 - regression_loss: 1.7580 - classification_loss: 0.3075 334/500 [===================>..........] - ETA: 39s - loss: 2.0668 - regression_loss: 1.7590 - classification_loss: 0.3078 335/500 [===================>..........] - ETA: 39s - loss: 2.0646 - regression_loss: 1.7570 - classification_loss: 0.3075 336/500 [===================>..........] - ETA: 38s - loss: 2.0643 - regression_loss: 1.7569 - classification_loss: 0.3074 337/500 [===================>..........] - ETA: 38s - loss: 2.0630 - regression_loss: 1.7558 - classification_loss: 0.3072 338/500 [===================>..........] - ETA: 38s - loss: 2.0637 - regression_loss: 1.7567 - classification_loss: 0.3071 339/500 [===================>..........] - ETA: 38s - loss: 2.0650 - regression_loss: 1.7570 - classification_loss: 0.3081 340/500 [===================>..........] - ETA: 37s - loss: 2.0651 - regression_loss: 1.7571 - classification_loss: 0.3080 341/500 [===================>..........] - ETA: 37s - loss: 2.0653 - regression_loss: 1.7572 - classification_loss: 0.3080 342/500 [===================>..........] - ETA: 37s - loss: 2.0638 - regression_loss: 1.7559 - classification_loss: 0.3078 343/500 [===================>..........] - ETA: 37s - loss: 2.0636 - regression_loss: 1.7558 - classification_loss: 0.3078 344/500 [===================>..........] - ETA: 36s - loss: 2.0650 - regression_loss: 1.7572 - classification_loss: 0.3078 345/500 [===================>..........] - ETA: 36s - loss: 2.0638 - regression_loss: 1.7563 - classification_loss: 0.3075 346/500 [===================>..........] - ETA: 36s - loss: 2.0648 - regression_loss: 1.7572 - classification_loss: 0.3075 347/500 [===================>..........] - ETA: 36s - loss: 2.0627 - regression_loss: 1.7556 - classification_loss: 0.3071 348/500 [===================>..........] 
- ETA: 36s - loss: 2.0625 - regression_loss: 1.7553 - classification_loss: 0.3072 349/500 [===================>..........] - ETA: 35s - loss: 2.0630 - regression_loss: 1.7557 - classification_loss: 0.3073 350/500 [====================>.........] - ETA: 35s - loss: 2.0624 - regression_loss: 1.7552 - classification_loss: 0.3072 351/500 [====================>.........] - ETA: 35s - loss: 2.0629 - regression_loss: 1.7556 - classification_loss: 0.3073 352/500 [====================>.........] - ETA: 35s - loss: 2.0634 - regression_loss: 1.7561 - classification_loss: 0.3073 353/500 [====================>.........] - ETA: 34s - loss: 2.0622 - regression_loss: 1.7551 - classification_loss: 0.3072 354/500 [====================>.........] - ETA: 34s - loss: 2.0620 - regression_loss: 1.7549 - classification_loss: 0.3071 355/500 [====================>.........] - ETA: 34s - loss: 2.0631 - regression_loss: 1.7559 - classification_loss: 0.3073 356/500 [====================>.........] - ETA: 34s - loss: 2.0642 - regression_loss: 1.7569 - classification_loss: 0.3073 357/500 [====================>.........] - ETA: 34s - loss: 2.0643 - regression_loss: 1.7570 - classification_loss: 0.3073 358/500 [====================>.........] - ETA: 33s - loss: 2.0640 - regression_loss: 1.7568 - classification_loss: 0.3072 359/500 [====================>.........] - ETA: 33s - loss: 2.0642 - regression_loss: 1.7570 - classification_loss: 0.3072 360/500 [====================>.........] - ETA: 33s - loss: 2.0632 - regression_loss: 1.7561 - classification_loss: 0.3071 361/500 [====================>.........] - ETA: 33s - loss: 2.0628 - regression_loss: 1.7559 - classification_loss: 0.3069 362/500 [====================>.........] - ETA: 32s - loss: 2.0636 - regression_loss: 1.7566 - classification_loss: 0.3070 363/500 [====================>.........] - ETA: 32s - loss: 2.0636 - regression_loss: 1.7568 - classification_loss: 0.3069 364/500 [====================>.........] 
- ETA: 32s - loss: 2.0642 - regression_loss: 1.7576 - classification_loss: 0.3065 365/500 [====================>.........] - ETA: 32s - loss: 2.0625 - regression_loss: 1.7564 - classification_loss: 0.3062 366/500 [====================>.........] - ETA: 31s - loss: 2.0626 - regression_loss: 1.7565 - classification_loss: 0.3061 367/500 [=====================>........] - ETA: 31s - loss: 2.0617 - regression_loss: 1.7558 - classification_loss: 0.3059 368/500 [=====================>........] - ETA: 31s - loss: 2.0619 - regression_loss: 1.7560 - classification_loss: 0.3060 369/500 [=====================>........] - ETA: 31s - loss: 2.0592 - regression_loss: 1.7535 - classification_loss: 0.3057 370/500 [=====================>........] - ETA: 31s - loss: 2.0583 - regression_loss: 1.7529 - classification_loss: 0.3055 371/500 [=====================>........] - ETA: 30s - loss: 2.0585 - regression_loss: 1.7531 - classification_loss: 0.3054 372/500 [=====================>........] - ETA: 30s - loss: 2.0587 - regression_loss: 1.7533 - classification_loss: 0.3054 373/500 [=====================>........] - ETA: 30s - loss: 2.0594 - regression_loss: 1.7537 - classification_loss: 0.3057 374/500 [=====================>........] - ETA: 30s - loss: 2.0604 - regression_loss: 1.7544 - classification_loss: 0.3060 375/500 [=====================>........] - ETA: 29s - loss: 2.0580 - regression_loss: 1.7523 - classification_loss: 0.3057 376/500 [=====================>........] - ETA: 29s - loss: 2.0586 - regression_loss: 1.7529 - classification_loss: 0.3057 377/500 [=====================>........] - ETA: 29s - loss: 2.0588 - regression_loss: 1.7531 - classification_loss: 0.3057 378/500 [=====================>........] - ETA: 29s - loss: 2.0587 - regression_loss: 1.7529 - classification_loss: 0.3058 379/500 [=====================>........] - ETA: 28s - loss: 2.0590 - regression_loss: 1.7532 - classification_loss: 0.3058 380/500 [=====================>........] 
500/500 [==============================] - 121s 242ms/step - loss: 2.0358 - regression_loss: 1.7333 - classification_loss: 0.3026
1172 instances of class plum with average precision: 0.5157
mAP: 0.5157
Epoch 00005: saving model to ./training/snapshots/resnet50_pascal_05.h5
Epoch 6/150
- ETA: 1:12 - loss: 1.9909 - regression_loss: 1.6892 - classification_loss: 0.3017 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9905 - regression_loss: 1.6889 - classification_loss: 0.3016 217/500 [============>.................] - ETA: 1:11 - loss: 1.9911 - regression_loss: 1.6892 - classification_loss: 0.3019 218/500 [============>.................] - ETA: 1:11 - loss: 1.9917 - regression_loss: 1.6897 - classification_loss: 0.3020 219/500 [============>.................] - ETA: 1:11 - loss: 1.9887 - regression_loss: 1.6873 - classification_loss: 0.3015 220/500 [============>.................] - ETA: 1:10 - loss: 1.9882 - regression_loss: 1.6869 - classification_loss: 0.3012 221/500 [============>.................] - ETA: 1:10 - loss: 1.9886 - regression_loss: 1.6878 - classification_loss: 0.3008 222/500 [============>.................] - ETA: 1:10 - loss: 1.9880 - regression_loss: 1.6872 - classification_loss: 0.3008 223/500 [============>.................] - ETA: 1:10 - loss: 1.9831 - regression_loss: 1.6830 - classification_loss: 0.3001 224/500 [============>.................] - ETA: 1:09 - loss: 1.9859 - regression_loss: 1.6855 - classification_loss: 0.3003 225/500 [============>.................] - ETA: 1:09 - loss: 1.9861 - regression_loss: 1.6859 - classification_loss: 0.3002 226/500 [============>.................] - ETA: 1:09 - loss: 1.9824 - regression_loss: 1.6828 - classification_loss: 0.2996 227/500 [============>.................] - ETA: 1:09 - loss: 1.9833 - regression_loss: 1.6837 - classification_loss: 0.2996 228/500 [============>.................] - ETA: 1:08 - loss: 1.9841 - regression_loss: 1.6843 - classification_loss: 0.2998 229/500 [============>.................] - ETA: 1:08 - loss: 1.9838 - regression_loss: 1.6843 - classification_loss: 0.2995 230/500 [============>.................] - ETA: 1:08 - loss: 1.9789 - regression_loss: 1.6801 - classification_loss: 0.2988 231/500 [============>.................] 
- ETA: 1:08 - loss: 1.9798 - regression_loss: 1.6808 - classification_loss: 0.2990 232/500 [============>.................] - ETA: 1:07 - loss: 1.9784 - regression_loss: 1.6795 - classification_loss: 0.2989 233/500 [============>.................] - ETA: 1:07 - loss: 1.9785 - regression_loss: 1.6797 - classification_loss: 0.2987 234/500 [=============>................] - ETA: 1:07 - loss: 1.9759 - regression_loss: 1.6776 - classification_loss: 0.2983 235/500 [=============>................] - ETA: 1:07 - loss: 1.9755 - regression_loss: 1.6769 - classification_loss: 0.2986 236/500 [=============>................] - ETA: 1:06 - loss: 1.9730 - regression_loss: 1.6747 - classification_loss: 0.2982 237/500 [=============>................] - ETA: 1:06 - loss: 1.9731 - regression_loss: 1.6750 - classification_loss: 0.2981 238/500 [=============>................] - ETA: 1:06 - loss: 1.9684 - regression_loss: 1.6710 - classification_loss: 0.2974 239/500 [=============>................] - ETA: 1:06 - loss: 1.9697 - regression_loss: 1.6720 - classification_loss: 0.2977 240/500 [=============>................] - ETA: 1:05 - loss: 1.9712 - regression_loss: 1.6733 - classification_loss: 0.2979 241/500 [=============>................] - ETA: 1:05 - loss: 1.9719 - regression_loss: 1.6739 - classification_loss: 0.2980 242/500 [=============>................] - ETA: 1:05 - loss: 1.9714 - regression_loss: 1.6736 - classification_loss: 0.2978 243/500 [=============>................] - ETA: 1:05 - loss: 1.9717 - regression_loss: 1.6738 - classification_loss: 0.2978 244/500 [=============>................] - ETA: 1:04 - loss: 1.9703 - regression_loss: 1.6727 - classification_loss: 0.2976 245/500 [=============>................] - ETA: 1:04 - loss: 1.9706 - regression_loss: 1.6732 - classification_loss: 0.2973 246/500 [=============>................] - ETA: 1:04 - loss: 1.9695 - regression_loss: 1.6724 - classification_loss: 0.2971 247/500 [=============>................] 
- ETA: 1:04 - loss: 1.9707 - regression_loss: 1.6735 - classification_loss: 0.2972 248/500 [=============>................] - ETA: 1:03 - loss: 1.9696 - regression_loss: 1.6725 - classification_loss: 0.2970 249/500 [=============>................] - ETA: 1:03 - loss: 1.9693 - regression_loss: 1.6721 - classification_loss: 0.2972 250/500 [==============>...............] - ETA: 1:03 - loss: 1.9693 - regression_loss: 1.6719 - classification_loss: 0.2974 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9701 - regression_loss: 1.6729 - classification_loss: 0.2973 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9697 - regression_loss: 1.6724 - classification_loss: 0.2974 253/500 [==============>...............] - ETA: 1:02 - loss: 1.9708 - regression_loss: 1.6733 - classification_loss: 0.2975 254/500 [==============>...............] - ETA: 1:02 - loss: 1.9733 - regression_loss: 1.6753 - classification_loss: 0.2979 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9749 - regression_loss: 1.6768 - classification_loss: 0.2981 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9742 - regression_loss: 1.6764 - classification_loss: 0.2979 257/500 [==============>...............] - ETA: 1:01 - loss: 1.9756 - regression_loss: 1.6776 - classification_loss: 0.2980 258/500 [==============>...............] - ETA: 1:01 - loss: 1.9748 - regression_loss: 1.6770 - classification_loss: 0.2978 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9758 - regression_loss: 1.6779 - classification_loss: 0.2979 260/500 [==============>...............] - ETA: 1:00 - loss: 1.9755 - regression_loss: 1.6778 - classification_loss: 0.2976 261/500 [==============>...............] - ETA: 1:00 - loss: 1.9750 - regression_loss: 1.6773 - classification_loss: 0.2977 262/500 [==============>...............] - ETA: 1:00 - loss: 1.9753 - regression_loss: 1.6777 - classification_loss: 0.2976 263/500 [==============>...............] 
- ETA: 59s - loss: 1.9753 - regression_loss: 1.6779 - classification_loss: 0.2974  264/500 [==============>...............] - ETA: 59s - loss: 1.9738 - regression_loss: 1.6766 - classification_loss: 0.2972 265/500 [==============>...............] - ETA: 59s - loss: 1.9730 - regression_loss: 1.6761 - classification_loss: 0.2969 266/500 [==============>...............] - ETA: 59s - loss: 1.9736 - regression_loss: 1.6768 - classification_loss: 0.2968 267/500 [===============>..............] - ETA: 58s - loss: 1.9712 - regression_loss: 1.6748 - classification_loss: 0.2963 268/500 [===============>..............] - ETA: 58s - loss: 1.9696 - regression_loss: 1.6737 - classification_loss: 0.2960 269/500 [===============>..............] - ETA: 58s - loss: 1.9716 - regression_loss: 1.6752 - classification_loss: 0.2965 270/500 [===============>..............] - ETA: 58s - loss: 1.9722 - regression_loss: 1.6755 - classification_loss: 0.2967 271/500 [===============>..............] - ETA: 57s - loss: 1.9722 - regression_loss: 1.6755 - classification_loss: 0.2968 272/500 [===============>..............] - ETA: 57s - loss: 1.9723 - regression_loss: 1.6757 - classification_loss: 0.2966 273/500 [===============>..............] - ETA: 57s - loss: 1.9769 - regression_loss: 1.6796 - classification_loss: 0.2972 274/500 [===============>..............] - ETA: 57s - loss: 1.9737 - regression_loss: 1.6771 - classification_loss: 0.2967 275/500 [===============>..............] - ETA: 56s - loss: 1.9729 - regression_loss: 1.6766 - classification_loss: 0.2963 276/500 [===============>..............] - ETA: 56s - loss: 1.9766 - regression_loss: 1.6793 - classification_loss: 0.2973 277/500 [===============>..............] - ETA: 56s - loss: 1.9749 - regression_loss: 1.6779 - classification_loss: 0.2971 278/500 [===============>..............] - ETA: 56s - loss: 1.9740 - regression_loss: 1.6771 - classification_loss: 0.2969 279/500 [===============>..............] 
- ETA: 55s - loss: 1.9733 - regression_loss: 1.6764 - classification_loss: 0.2969 280/500 [===============>..............] - ETA: 55s - loss: 1.9743 - regression_loss: 1.6773 - classification_loss: 0.2969 281/500 [===============>..............] - ETA: 55s - loss: 1.9734 - regression_loss: 1.6767 - classification_loss: 0.2967 282/500 [===============>..............] - ETA: 55s - loss: 1.9748 - regression_loss: 1.6781 - classification_loss: 0.2967 283/500 [===============>..............] - ETA: 54s - loss: 1.9746 - regression_loss: 1.6779 - classification_loss: 0.2967 284/500 [================>.............] - ETA: 54s - loss: 1.9737 - regression_loss: 1.6771 - classification_loss: 0.2966 285/500 [================>.............] - ETA: 54s - loss: 1.9752 - regression_loss: 1.6778 - classification_loss: 0.2974 286/500 [================>.............] - ETA: 54s - loss: 1.9770 - regression_loss: 1.6789 - classification_loss: 0.2981 287/500 [================>.............] - ETA: 53s - loss: 1.9766 - regression_loss: 1.6785 - classification_loss: 0.2982 288/500 [================>.............] - ETA: 53s - loss: 1.9762 - regression_loss: 1.6782 - classification_loss: 0.2979 289/500 [================>.............] - ETA: 53s - loss: 1.9772 - regression_loss: 1.6793 - classification_loss: 0.2980 290/500 [================>.............] - ETA: 53s - loss: 1.9774 - regression_loss: 1.6794 - classification_loss: 0.2980 291/500 [================>.............] - ETA: 52s - loss: 1.9780 - regression_loss: 1.6801 - classification_loss: 0.2980 292/500 [================>.............] - ETA: 52s - loss: 1.9767 - regression_loss: 1.6790 - classification_loss: 0.2978 293/500 [================>.............] - ETA: 52s - loss: 1.9760 - regression_loss: 1.6783 - classification_loss: 0.2977 294/500 [================>.............] - ETA: 52s - loss: 1.9758 - regression_loss: 1.6782 - classification_loss: 0.2976 295/500 [================>.............] 
- ETA: 51s - loss: 1.9772 - regression_loss: 1.6795 - classification_loss: 0.2978 296/500 [================>.............] - ETA: 51s - loss: 1.9766 - regression_loss: 1.6790 - classification_loss: 0.2976 297/500 [================>.............] - ETA: 51s - loss: 1.9730 - regression_loss: 1.6759 - classification_loss: 0.2971 298/500 [================>.............] - ETA: 51s - loss: 1.9721 - regression_loss: 1.6755 - classification_loss: 0.2966 299/500 [================>.............] - ETA: 50s - loss: 1.9696 - regression_loss: 1.6735 - classification_loss: 0.2962 300/500 [=================>............] - ETA: 50s - loss: 1.9706 - regression_loss: 1.6743 - classification_loss: 0.2963 301/500 [=================>............] - ETA: 50s - loss: 1.9709 - regression_loss: 1.6747 - classification_loss: 0.2962 302/500 [=================>............] - ETA: 50s - loss: 1.9715 - regression_loss: 1.6752 - classification_loss: 0.2963 303/500 [=================>............] - ETA: 49s - loss: 1.9736 - regression_loss: 1.6772 - classification_loss: 0.2965 304/500 [=================>............] - ETA: 49s - loss: 1.9698 - regression_loss: 1.6735 - classification_loss: 0.2962 305/500 [=================>............] - ETA: 49s - loss: 1.9700 - regression_loss: 1.6738 - classification_loss: 0.2961 306/500 [=================>............] - ETA: 49s - loss: 1.9688 - regression_loss: 1.6730 - classification_loss: 0.2958 307/500 [=================>............] - ETA: 48s - loss: 1.9666 - regression_loss: 1.6712 - classification_loss: 0.2954 308/500 [=================>............] - ETA: 48s - loss: 1.9669 - regression_loss: 1.6709 - classification_loss: 0.2960 309/500 [=================>............] - ETA: 48s - loss: 1.9655 - regression_loss: 1.6698 - classification_loss: 0.2956 310/500 [=================>............] - ETA: 48s - loss: 1.9659 - regression_loss: 1.6702 - classification_loss: 0.2956 311/500 [=================>............] 
- ETA: 47s - loss: 1.9661 - regression_loss: 1.6704 - classification_loss: 0.2956 312/500 [=================>............] - ETA: 47s - loss: 1.9674 - regression_loss: 1.6717 - classification_loss: 0.2958 313/500 [=================>............] - ETA: 47s - loss: 1.9671 - regression_loss: 1.6714 - classification_loss: 0.2957 314/500 [=================>............] - ETA: 47s - loss: 1.9691 - regression_loss: 1.6727 - classification_loss: 0.2964 315/500 [=================>............] - ETA: 46s - loss: 1.9702 - regression_loss: 1.6734 - classification_loss: 0.2968 316/500 [=================>............] - ETA: 46s - loss: 1.9695 - regression_loss: 1.6726 - classification_loss: 0.2969 317/500 [==================>...........] - ETA: 46s - loss: 1.9719 - regression_loss: 1.6743 - classification_loss: 0.2975 318/500 [==================>...........] - ETA: 46s - loss: 1.9699 - regression_loss: 1.6726 - classification_loss: 0.2973 319/500 [==================>...........] - ETA: 45s - loss: 1.9711 - regression_loss: 1.6731 - classification_loss: 0.2979 320/500 [==================>...........] - ETA: 45s - loss: 1.9736 - regression_loss: 1.6749 - classification_loss: 0.2988 321/500 [==================>...........] - ETA: 45s - loss: 1.9739 - regression_loss: 1.6750 - classification_loss: 0.2989 322/500 [==================>...........] - ETA: 44s - loss: 1.9727 - regression_loss: 1.6740 - classification_loss: 0.2986 323/500 [==================>...........] - ETA: 44s - loss: 1.9728 - regression_loss: 1.6741 - classification_loss: 0.2986 324/500 [==================>...........] - ETA: 44s - loss: 1.9727 - regression_loss: 1.6742 - classification_loss: 0.2985 325/500 [==================>...........] - ETA: 44s - loss: 1.9736 - regression_loss: 1.6747 - classification_loss: 0.2989 326/500 [==================>...........] - ETA: 43s - loss: 1.9732 - regression_loss: 1.6745 - classification_loss: 0.2987 327/500 [==================>...........] 
- ETA: 43s - loss: 1.9729 - regression_loss: 1.6744 - classification_loss: 0.2986 328/500 [==================>...........] - ETA: 43s - loss: 1.9744 - regression_loss: 1.6758 - classification_loss: 0.2985 329/500 [==================>...........] - ETA: 43s - loss: 1.9710 - regression_loss: 1.6730 - classification_loss: 0.2980 330/500 [==================>...........] - ETA: 42s - loss: 1.9720 - regression_loss: 1.6738 - classification_loss: 0.2982 331/500 [==================>...........] - ETA: 42s - loss: 1.9699 - regression_loss: 1.6721 - classification_loss: 0.2978 332/500 [==================>...........] - ETA: 42s - loss: 1.9684 - regression_loss: 1.6709 - classification_loss: 0.2975 333/500 [==================>...........] - ETA: 42s - loss: 1.9675 - regression_loss: 1.6700 - classification_loss: 0.2975 334/500 [===================>..........] - ETA: 41s - loss: 1.9656 - regression_loss: 1.6683 - classification_loss: 0.2973 335/500 [===================>..........] - ETA: 41s - loss: 1.9656 - regression_loss: 1.6685 - classification_loss: 0.2971 336/500 [===================>..........] - ETA: 41s - loss: 1.9658 - regression_loss: 1.6688 - classification_loss: 0.2971 337/500 [===================>..........] - ETA: 41s - loss: 1.9665 - regression_loss: 1.6693 - classification_loss: 0.2971 338/500 [===================>..........] - ETA: 40s - loss: 1.9660 - regression_loss: 1.6689 - classification_loss: 0.2971 339/500 [===================>..........] - ETA: 40s - loss: 1.9660 - regression_loss: 1.6688 - classification_loss: 0.2972 340/500 [===================>..........] - ETA: 40s - loss: 1.9639 - regression_loss: 1.6668 - classification_loss: 0.2971 341/500 [===================>..........] - ETA: 40s - loss: 1.9645 - regression_loss: 1.6674 - classification_loss: 0.2971 342/500 [===================>..........] - ETA: 39s - loss: 1.9647 - regression_loss: 1.6677 - classification_loss: 0.2970 343/500 [===================>..........] 
- ETA: 39s - loss: 1.9632 - regression_loss: 1.6664 - classification_loss: 0.2967 344/500 [===================>..........] - ETA: 39s - loss: 1.9638 - regression_loss: 1.6668 - classification_loss: 0.2970 345/500 [===================>..........] - ETA: 39s - loss: 1.9642 - regression_loss: 1.6671 - classification_loss: 0.2971 346/500 [===================>..........] - ETA: 38s - loss: 1.9644 - regression_loss: 1.6673 - classification_loss: 0.2971 347/500 [===================>..........] - ETA: 38s - loss: 1.9633 - regression_loss: 1.6664 - classification_loss: 0.2969 348/500 [===================>..........] - ETA: 38s - loss: 1.9613 - regression_loss: 1.6648 - classification_loss: 0.2966 349/500 [===================>..........] - ETA: 38s - loss: 1.9606 - regression_loss: 1.6642 - classification_loss: 0.2964 350/500 [====================>.........] - ETA: 37s - loss: 1.9617 - regression_loss: 1.6652 - classification_loss: 0.2966 351/500 [====================>.........] - ETA: 37s - loss: 1.9601 - regression_loss: 1.6638 - classification_loss: 0.2963 352/500 [====================>.........] - ETA: 37s - loss: 1.9595 - regression_loss: 1.6627 - classification_loss: 0.2968 353/500 [====================>.........] - ETA: 37s - loss: 1.9601 - regression_loss: 1.6632 - classification_loss: 0.2969 354/500 [====================>.........] - ETA: 36s - loss: 1.9597 - regression_loss: 1.6626 - classification_loss: 0.2971 355/500 [====================>.........] - ETA: 36s - loss: 1.9604 - regression_loss: 1.6633 - classification_loss: 0.2971 356/500 [====================>.........] - ETA: 36s - loss: 1.9607 - regression_loss: 1.6635 - classification_loss: 0.2972 357/500 [====================>.........] - ETA: 36s - loss: 1.9607 - regression_loss: 1.6636 - classification_loss: 0.2971 358/500 [====================>.........] - ETA: 35s - loss: 1.9601 - regression_loss: 1.6632 - classification_loss: 0.2969 359/500 [====================>.........] 
- ETA: 35s - loss: 1.9614 - regression_loss: 1.6642 - classification_loss: 0.2972 360/500 [====================>.........] - ETA: 35s - loss: 1.9610 - regression_loss: 1.6639 - classification_loss: 0.2971 361/500 [====================>.........] - ETA: 35s - loss: 1.9608 - regression_loss: 1.6638 - classification_loss: 0.2970 362/500 [====================>.........] - ETA: 34s - loss: 1.9596 - regression_loss: 1.6628 - classification_loss: 0.2968 363/500 [====================>.........] - ETA: 34s - loss: 1.9576 - regression_loss: 1.6613 - classification_loss: 0.2963 364/500 [====================>.........] - ETA: 34s - loss: 1.9555 - regression_loss: 1.6595 - classification_loss: 0.2960 365/500 [====================>.........] - ETA: 34s - loss: 1.9570 - regression_loss: 1.6606 - classification_loss: 0.2964 366/500 [====================>.........] - ETA: 33s - loss: 1.9551 - regression_loss: 1.6590 - classification_loss: 0.2961 367/500 [=====================>........] - ETA: 33s - loss: 1.9553 - regression_loss: 1.6592 - classification_loss: 0.2961 368/500 [=====================>........] - ETA: 33s - loss: 1.9556 - regression_loss: 1.6594 - classification_loss: 0.2962 369/500 [=====================>........] - ETA: 33s - loss: 1.9549 - regression_loss: 1.6589 - classification_loss: 0.2960 370/500 [=====================>........] - ETA: 32s - loss: 1.9551 - regression_loss: 1.6591 - classification_loss: 0.2960 371/500 [=====================>........] - ETA: 32s - loss: 1.9555 - regression_loss: 1.6593 - classification_loss: 0.2962 372/500 [=====================>........] - ETA: 32s - loss: 1.9541 - regression_loss: 1.6581 - classification_loss: 0.2959 373/500 [=====================>........] - ETA: 32s - loss: 1.9537 - regression_loss: 1.6578 - classification_loss: 0.2959 374/500 [=====================>........] - ETA: 31s - loss: 1.9548 - regression_loss: 1.6586 - classification_loss: 0.2962 375/500 [=====================>........] 
- ETA: 31s - loss: 1.9540 - regression_loss: 1.6576 - classification_loss: 0.2964 376/500 [=====================>........] - ETA: 31s - loss: 1.9537 - regression_loss: 1.6573 - classification_loss: 0.2964 377/500 [=====================>........] - ETA: 31s - loss: 1.9536 - regression_loss: 1.6572 - classification_loss: 0.2963 378/500 [=====================>........] - ETA: 30s - loss: 1.9524 - regression_loss: 1.6560 - classification_loss: 0.2964 379/500 [=====================>........] - ETA: 30s - loss: 1.9557 - regression_loss: 1.6590 - classification_loss: 0.2966 380/500 [=====================>........] - ETA: 30s - loss: 1.9561 - regression_loss: 1.6594 - classification_loss: 0.2967 381/500 [=====================>........] - ETA: 30s - loss: 1.9539 - regression_loss: 1.6575 - classification_loss: 0.2964 382/500 [=====================>........] - ETA: 29s - loss: 1.9552 - regression_loss: 1.6586 - classification_loss: 0.2966 383/500 [=====================>........] - ETA: 29s - loss: 1.9559 - regression_loss: 1.6594 - classification_loss: 0.2965 384/500 [======================>.......] - ETA: 29s - loss: 1.9564 - regression_loss: 1.6599 - classification_loss: 0.2965 385/500 [======================>.......] - ETA: 29s - loss: 1.9580 - regression_loss: 1.6612 - classification_loss: 0.2968 386/500 [======================>.......] - ETA: 28s - loss: 1.9579 - regression_loss: 1.6611 - classification_loss: 0.2968 387/500 [======================>.......] - ETA: 28s - loss: 1.9588 - regression_loss: 1.6619 - classification_loss: 0.2969 388/500 [======================>.......] - ETA: 28s - loss: 1.9578 - regression_loss: 1.6611 - classification_loss: 0.2968 389/500 [======================>.......] - ETA: 28s - loss: 1.9570 - regression_loss: 1.6605 - classification_loss: 0.2966 390/500 [======================>.......] - ETA: 27s - loss: 1.9573 - regression_loss: 1.6607 - classification_loss: 0.2966 391/500 [======================>.......] 
- ETA: 27s - loss: 1.9572 - regression_loss: 1.6608 - classification_loss: 0.2963 392/500 [======================>.......] - ETA: 27s - loss: 1.9562 - regression_loss: 1.6600 - classification_loss: 0.2962 393/500 [======================>.......] - ETA: 27s - loss: 1.9570 - regression_loss: 1.6607 - classification_loss: 0.2962 394/500 [======================>.......] - ETA: 26s - loss: 1.9574 - regression_loss: 1.6612 - classification_loss: 0.2962 395/500 [======================>.......] - ETA: 26s - loss: 1.9575 - regression_loss: 1.6613 - classification_loss: 0.2962 396/500 [======================>.......] - ETA: 26s - loss: 1.9583 - regression_loss: 1.6619 - classification_loss: 0.2964 397/500 [======================>.......] - ETA: 26s - loss: 1.9579 - regression_loss: 1.6616 - classification_loss: 0.2963 398/500 [======================>.......] - ETA: 25s - loss: 1.9580 - regression_loss: 1.6619 - classification_loss: 0.2962 399/500 [======================>.......] - ETA: 25s - loss: 1.9582 - regression_loss: 1.6619 - classification_loss: 0.2962 400/500 [=======================>......] - ETA: 25s - loss: 1.9613 - regression_loss: 1.6645 - classification_loss: 0.2969 401/500 [=======================>......] - ETA: 25s - loss: 1.9602 - regression_loss: 1.6635 - classification_loss: 0.2967 402/500 [=======================>......] - ETA: 24s - loss: 1.9590 - regression_loss: 1.6626 - classification_loss: 0.2964 403/500 [=======================>......] - ETA: 24s - loss: 1.9601 - regression_loss: 1.6636 - classification_loss: 0.2966 404/500 [=======================>......] - ETA: 24s - loss: 1.9592 - regression_loss: 1.6628 - classification_loss: 0.2964 405/500 [=======================>......] - ETA: 24s - loss: 1.9568 - regression_loss: 1.6609 - classification_loss: 0.2959 406/500 [=======================>......] - ETA: 23s - loss: 1.9562 - regression_loss: 1.6602 - classification_loss: 0.2960 407/500 [=======================>......] 
- ETA: 23s - loss: 1.9542 - regression_loss: 1.6586 - classification_loss: 0.2957 408/500 [=======================>......] - ETA: 23s - loss: 1.9543 - regression_loss: 1.6587 - classification_loss: 0.2956 409/500 [=======================>......] - ETA: 23s - loss: 1.9539 - regression_loss: 1.6583 - classification_loss: 0.2956 410/500 [=======================>......] - ETA: 22s - loss: 1.9529 - regression_loss: 1.6575 - classification_loss: 0.2954 411/500 [=======================>......] - ETA: 22s - loss: 1.9537 - regression_loss: 1.6582 - classification_loss: 0.2955 412/500 [=======================>......] - ETA: 22s - loss: 1.9533 - regression_loss: 1.6579 - classification_loss: 0.2954 413/500 [=======================>......] - ETA: 22s - loss: 1.9538 - regression_loss: 1.6585 - classification_loss: 0.2954 414/500 [=======================>......] - ETA: 21s - loss: 1.9542 - regression_loss: 1.6588 - classification_loss: 0.2954 415/500 [=======================>......] - ETA: 21s - loss: 1.9542 - regression_loss: 1.6590 - classification_loss: 0.2952 416/500 [=======================>......] - ETA: 21s - loss: 1.9527 - regression_loss: 1.6574 - classification_loss: 0.2953 417/500 [========================>.....] - ETA: 21s - loss: 1.9529 - regression_loss: 1.6576 - classification_loss: 0.2953 418/500 [========================>.....] - ETA: 20s - loss: 1.9520 - regression_loss: 1.6568 - classification_loss: 0.2952 419/500 [========================>.....] - ETA: 20s - loss: 1.9524 - regression_loss: 1.6572 - classification_loss: 0.2952 420/500 [========================>.....] - ETA: 20s - loss: 1.9523 - regression_loss: 1.6571 - classification_loss: 0.2951 421/500 [========================>.....] - ETA: 19s - loss: 1.9522 - regression_loss: 1.6572 - classification_loss: 0.2950 422/500 [========================>.....] - ETA: 19s - loss: 1.9522 - regression_loss: 1.6572 - classification_loss: 0.2950 423/500 [========================>.....] 
[per-step progress output for epoch 6, steps 424-500, omitted]
500/500 [==============================] - 126s 253ms/step - loss: 1.9444 - regression_loss: 1.6529 - classification_loss: 0.2915
1172 instances of class plum with average precision: 0.6122
mAP: 0.6122
Epoch 00006: saving model to ./training/snapshots/resnet50_pascal_06.h5
Epoch 7/150
[per-step progress output for epoch 7, steps 1-256, omitted]
257/500 [==============>...............] - ETA: 1:01 - loss: 1.8897 - regression_loss: 1.6101 - classification_loss: 0.2796 258/500 [==============>...............] 
- ETA: 1:01 - loss: 1.8880 - regression_loss: 1.6088 - classification_loss: 0.2792 259/500 [==============>...............] - ETA: 1:01 - loss: 1.8884 - regression_loss: 1.6090 - classification_loss: 0.2794 260/500 [==============>...............] - ETA: 1:00 - loss: 1.8893 - regression_loss: 1.6098 - classification_loss: 0.2795 261/500 [==============>...............] - ETA: 1:00 - loss: 1.8900 - regression_loss: 1.6104 - classification_loss: 0.2796 262/500 [==============>...............] - ETA: 1:00 - loss: 1.8907 - regression_loss: 1.6111 - classification_loss: 0.2797 263/500 [==============>...............] - ETA: 1:00 - loss: 1.8905 - regression_loss: 1.6110 - classification_loss: 0.2795 264/500 [==============>...............] - ETA: 59s - loss: 1.8899 - regression_loss: 1.6106 - classification_loss: 0.2793  265/500 [==============>...............] - ETA: 59s - loss: 1.8904 - regression_loss: 1.6112 - classification_loss: 0.2792 266/500 [==============>...............] - ETA: 59s - loss: 1.8892 - regression_loss: 1.6102 - classification_loss: 0.2790 267/500 [===============>..............] - ETA: 59s - loss: 1.8896 - regression_loss: 1.6108 - classification_loss: 0.2788 268/500 [===============>..............] - ETA: 58s - loss: 1.8895 - regression_loss: 1.6110 - classification_loss: 0.2785 269/500 [===============>..............] - ETA: 58s - loss: 1.8902 - regression_loss: 1.6116 - classification_loss: 0.2786 270/500 [===============>..............] - ETA: 58s - loss: 1.8939 - regression_loss: 1.6149 - classification_loss: 0.2789 271/500 [===============>..............] - ETA: 58s - loss: 1.8928 - regression_loss: 1.6141 - classification_loss: 0.2787 272/500 [===============>..............] - ETA: 57s - loss: 1.8961 - regression_loss: 1.6167 - classification_loss: 0.2794 273/500 [===============>..............] - ETA: 57s - loss: 1.8935 - regression_loss: 1.6145 - classification_loss: 0.2791 274/500 [===============>..............] 
- ETA: 57s - loss: 1.8940 - regression_loss: 1.6148 - classification_loss: 0.2792 275/500 [===============>..............] - ETA: 57s - loss: 1.8946 - regression_loss: 1.6153 - classification_loss: 0.2793 276/500 [===============>..............] - ETA: 56s - loss: 1.8952 - regression_loss: 1.6159 - classification_loss: 0.2793 277/500 [===============>..............] - ETA: 56s - loss: 1.8951 - regression_loss: 1.6159 - classification_loss: 0.2793 278/500 [===============>..............] - ETA: 56s - loss: 1.8953 - regression_loss: 1.6160 - classification_loss: 0.2793 279/500 [===============>..............] - ETA: 56s - loss: 1.8961 - regression_loss: 1.6170 - classification_loss: 0.2792 280/500 [===============>..............] - ETA: 55s - loss: 1.8971 - regression_loss: 1.6178 - classification_loss: 0.2793 281/500 [===============>..............] - ETA: 55s - loss: 1.8979 - regression_loss: 1.6183 - classification_loss: 0.2796 282/500 [===============>..............] - ETA: 55s - loss: 1.8983 - regression_loss: 1.6187 - classification_loss: 0.2795 283/500 [===============>..............] - ETA: 55s - loss: 1.8984 - regression_loss: 1.6190 - classification_loss: 0.2794 284/500 [================>.............] - ETA: 54s - loss: 1.8987 - regression_loss: 1.6194 - classification_loss: 0.2793 285/500 [================>.............] - ETA: 54s - loss: 1.8993 - regression_loss: 1.6202 - classification_loss: 0.2792 286/500 [================>.............] - ETA: 54s - loss: 1.8980 - regression_loss: 1.6191 - classification_loss: 0.2789 287/500 [================>.............] - ETA: 54s - loss: 1.8963 - regression_loss: 1.6176 - classification_loss: 0.2787 288/500 [================>.............] - ETA: 53s - loss: 1.8965 - regression_loss: 1.6178 - classification_loss: 0.2786 289/500 [================>.............] - ETA: 53s - loss: 1.8956 - regression_loss: 1.6172 - classification_loss: 0.2784 290/500 [================>.............] 
- ETA: 53s - loss: 1.8946 - regression_loss: 1.6165 - classification_loss: 0.2781 291/500 [================>.............] - ETA: 53s - loss: 1.8959 - regression_loss: 1.6177 - classification_loss: 0.2783 292/500 [================>.............] - ETA: 52s - loss: 1.8921 - regression_loss: 1.6143 - classification_loss: 0.2778 293/500 [================>.............] - ETA: 52s - loss: 1.8937 - regression_loss: 1.6156 - classification_loss: 0.2781 294/500 [================>.............] - ETA: 52s - loss: 1.8943 - regression_loss: 1.6161 - classification_loss: 0.2782 295/500 [================>.............] - ETA: 52s - loss: 1.8946 - regression_loss: 1.6165 - classification_loss: 0.2782 296/500 [================>.............] - ETA: 51s - loss: 1.8962 - regression_loss: 1.6175 - classification_loss: 0.2786 297/500 [================>.............] - ETA: 51s - loss: 1.8970 - regression_loss: 1.6183 - classification_loss: 0.2788 298/500 [================>.............] - ETA: 51s - loss: 1.8980 - regression_loss: 1.6191 - classification_loss: 0.2789 299/500 [================>.............] - ETA: 50s - loss: 1.8996 - regression_loss: 1.6206 - classification_loss: 0.2790 300/500 [=================>............] - ETA: 50s - loss: 1.9007 - regression_loss: 1.6214 - classification_loss: 0.2793 301/500 [=================>............] - ETA: 50s - loss: 1.9024 - regression_loss: 1.6226 - classification_loss: 0.2798 302/500 [=================>............] - ETA: 50s - loss: 1.8990 - regression_loss: 1.6197 - classification_loss: 0.2793 303/500 [=================>............] - ETA: 49s - loss: 1.8988 - regression_loss: 1.6195 - classification_loss: 0.2793 304/500 [=================>............] - ETA: 49s - loss: 1.8972 - regression_loss: 1.6180 - classification_loss: 0.2792 305/500 [=================>............] - ETA: 49s - loss: 1.8991 - regression_loss: 1.6196 - classification_loss: 0.2796 306/500 [=================>............] 
- ETA: 49s - loss: 1.9012 - regression_loss: 1.6210 - classification_loss: 0.2802 307/500 [=================>............] - ETA: 48s - loss: 1.9003 - regression_loss: 1.6203 - classification_loss: 0.2800 308/500 [=================>............] - ETA: 48s - loss: 1.8989 - regression_loss: 1.6191 - classification_loss: 0.2798 309/500 [=================>............] - ETA: 48s - loss: 1.8985 - regression_loss: 1.6188 - classification_loss: 0.2798 310/500 [=================>............] - ETA: 48s - loss: 1.8988 - regression_loss: 1.6192 - classification_loss: 0.2797 311/500 [=================>............] - ETA: 47s - loss: 1.8990 - regression_loss: 1.6195 - classification_loss: 0.2796 312/500 [=================>............] - ETA: 47s - loss: 1.8986 - regression_loss: 1.6192 - classification_loss: 0.2794 313/500 [=================>............] - ETA: 47s - loss: 1.8977 - regression_loss: 1.6182 - classification_loss: 0.2795 314/500 [=================>............] - ETA: 47s - loss: 1.8990 - regression_loss: 1.6194 - classification_loss: 0.2796 315/500 [=================>............] - ETA: 46s - loss: 1.8986 - regression_loss: 1.6191 - classification_loss: 0.2795 316/500 [=================>............] - ETA: 46s - loss: 1.8988 - regression_loss: 1.6197 - classification_loss: 0.2792 317/500 [==================>...........] - ETA: 46s - loss: 1.8982 - regression_loss: 1.6189 - classification_loss: 0.2793 318/500 [==================>...........] - ETA: 46s - loss: 1.8994 - regression_loss: 1.6200 - classification_loss: 0.2794 319/500 [==================>...........] - ETA: 45s - loss: 1.8998 - regression_loss: 1.6204 - classification_loss: 0.2794 320/500 [==================>...........] - ETA: 45s - loss: 1.8975 - regression_loss: 1.6184 - classification_loss: 0.2791 321/500 [==================>...........] - ETA: 45s - loss: 1.9026 - regression_loss: 1.6183 - classification_loss: 0.2843 322/500 [==================>...........] 
- ETA: 45s - loss: 1.9014 - regression_loss: 1.6171 - classification_loss: 0.2843 323/500 [==================>...........] - ETA: 44s - loss: 1.9077 - regression_loss: 1.6209 - classification_loss: 0.2868 324/500 [==================>...........] - ETA: 44s - loss: 1.9050 - regression_loss: 1.6186 - classification_loss: 0.2864 325/500 [==================>...........] - ETA: 44s - loss: 1.9057 - regression_loss: 1.6193 - classification_loss: 0.2864 326/500 [==================>...........] - ETA: 44s - loss: 1.9054 - regression_loss: 1.6191 - classification_loss: 0.2863 327/500 [==================>...........] - ETA: 43s - loss: 1.9055 - regression_loss: 1.6193 - classification_loss: 0.2863 328/500 [==================>...........] - ETA: 43s - loss: 1.9053 - regression_loss: 1.6191 - classification_loss: 0.2862 329/500 [==================>...........] - ETA: 43s - loss: 1.9042 - regression_loss: 1.6183 - classification_loss: 0.2860 330/500 [==================>...........] - ETA: 43s - loss: 1.9044 - regression_loss: 1.6185 - classification_loss: 0.2859 331/500 [==================>...........] - ETA: 42s - loss: 1.9047 - regression_loss: 1.6192 - classification_loss: 0.2855 332/500 [==================>...........] - ETA: 42s - loss: 1.9024 - regression_loss: 1.6171 - classification_loss: 0.2853 333/500 [==================>...........] - ETA: 42s - loss: 1.9017 - regression_loss: 1.6166 - classification_loss: 0.2852 334/500 [===================>..........] - ETA: 42s - loss: 1.9004 - regression_loss: 1.6156 - classification_loss: 0.2848 335/500 [===================>..........] - ETA: 41s - loss: 1.9011 - regression_loss: 1.6162 - classification_loss: 0.2849 336/500 [===================>..........] - ETA: 41s - loss: 1.8994 - regression_loss: 1.6149 - classification_loss: 0.2846 337/500 [===================>..........] - ETA: 41s - loss: 1.8999 - regression_loss: 1.6153 - classification_loss: 0.2845 338/500 [===================>..........] 
- ETA: 41s - loss: 1.8993 - regression_loss: 1.6147 - classification_loss: 0.2846 339/500 [===================>..........] - ETA: 40s - loss: 1.8979 - regression_loss: 1.6135 - classification_loss: 0.2844 340/500 [===================>..........] - ETA: 40s - loss: 1.8977 - regression_loss: 1.6132 - classification_loss: 0.2846 341/500 [===================>..........] - ETA: 40s - loss: 1.8977 - regression_loss: 1.6131 - classification_loss: 0.2846 342/500 [===================>..........] - ETA: 40s - loss: 1.8985 - regression_loss: 1.6138 - classification_loss: 0.2847 343/500 [===================>..........] - ETA: 39s - loss: 1.8992 - regression_loss: 1.6143 - classification_loss: 0.2849 344/500 [===================>..........] - ETA: 39s - loss: 1.9013 - regression_loss: 1.6161 - classification_loss: 0.2853 345/500 [===================>..........] - ETA: 39s - loss: 1.9017 - regression_loss: 1.6164 - classification_loss: 0.2853 346/500 [===================>..........] - ETA: 38s - loss: 1.9000 - regression_loss: 1.6151 - classification_loss: 0.2849 347/500 [===================>..........] - ETA: 38s - loss: 1.9006 - regression_loss: 1.6157 - classification_loss: 0.2849 348/500 [===================>..........] - ETA: 38s - loss: 1.9008 - regression_loss: 1.6159 - classification_loss: 0.2848 349/500 [===================>..........] - ETA: 38s - loss: 1.9022 - regression_loss: 1.6171 - classification_loss: 0.2851 350/500 [====================>.........] - ETA: 37s - loss: 1.9012 - regression_loss: 1.6162 - classification_loss: 0.2850 351/500 [====================>.........] - ETA: 37s - loss: 1.9023 - regression_loss: 1.6172 - classification_loss: 0.2852 352/500 [====================>.........] - ETA: 37s - loss: 1.9029 - regression_loss: 1.6177 - classification_loss: 0.2852 353/500 [====================>.........] - ETA: 37s - loss: 1.9036 - regression_loss: 1.6181 - classification_loss: 0.2855 354/500 [====================>.........] 
- ETA: 36s - loss: 1.9037 - regression_loss: 1.6182 - classification_loss: 0.2854 355/500 [====================>.........] - ETA: 36s - loss: 1.9059 - regression_loss: 1.6199 - classification_loss: 0.2860 356/500 [====================>.........] - ETA: 36s - loss: 1.9060 - regression_loss: 1.6202 - classification_loss: 0.2858 357/500 [====================>.........] - ETA: 36s - loss: 1.9089 - regression_loss: 1.6227 - classification_loss: 0.2862 358/500 [====================>.........] - ETA: 35s - loss: 1.9088 - regression_loss: 1.6228 - classification_loss: 0.2860 359/500 [====================>.........] - ETA: 35s - loss: 1.9079 - regression_loss: 1.6221 - classification_loss: 0.2858 360/500 [====================>.........] - ETA: 35s - loss: 1.9067 - regression_loss: 1.6211 - classification_loss: 0.2856 361/500 [====================>.........] - ETA: 35s - loss: 1.9069 - regression_loss: 1.6215 - classification_loss: 0.2854 362/500 [====================>.........] - ETA: 34s - loss: 1.9067 - regression_loss: 1.6214 - classification_loss: 0.2853 363/500 [====================>.........] - ETA: 34s - loss: 1.9077 - regression_loss: 1.6221 - classification_loss: 0.2856 364/500 [====================>.........] - ETA: 34s - loss: 1.9055 - regression_loss: 1.6204 - classification_loss: 0.2851 365/500 [====================>.........] - ETA: 34s - loss: 1.9050 - regression_loss: 1.6199 - classification_loss: 0.2850 366/500 [====================>.........] - ETA: 33s - loss: 1.9046 - regression_loss: 1.6197 - classification_loss: 0.2849 367/500 [=====================>........] - ETA: 33s - loss: 1.9046 - regression_loss: 1.6197 - classification_loss: 0.2849 368/500 [=====================>........] - ETA: 33s - loss: 1.9038 - regression_loss: 1.6193 - classification_loss: 0.2845 369/500 [=====================>........] - ETA: 33s - loss: 1.9032 - regression_loss: 1.6188 - classification_loss: 0.2844 370/500 [=====================>........] 
- ETA: 32s - loss: 1.9047 - regression_loss: 1.6202 - classification_loss: 0.2846 371/500 [=====================>........] - ETA: 32s - loss: 1.9049 - regression_loss: 1.6203 - classification_loss: 0.2846 372/500 [=====================>........] - ETA: 32s - loss: 1.9051 - regression_loss: 1.6208 - classification_loss: 0.2843 373/500 [=====================>........] - ETA: 32s - loss: 1.9030 - regression_loss: 1.6190 - classification_loss: 0.2840 374/500 [=====================>........] - ETA: 31s - loss: 1.9038 - regression_loss: 1.6195 - classification_loss: 0.2843 375/500 [=====================>........] - ETA: 31s - loss: 1.9051 - regression_loss: 1.6203 - classification_loss: 0.2847 376/500 [=====================>........] - ETA: 31s - loss: 1.9071 - regression_loss: 1.6220 - classification_loss: 0.2851 377/500 [=====================>........] - ETA: 31s - loss: 1.9070 - regression_loss: 1.6220 - classification_loss: 0.2850 378/500 [=====================>........] - ETA: 30s - loss: 1.9064 - regression_loss: 1.6215 - classification_loss: 0.2848 379/500 [=====================>........] - ETA: 30s - loss: 1.9068 - regression_loss: 1.6220 - classification_loss: 0.2848 380/500 [=====================>........] - ETA: 30s - loss: 1.9069 - regression_loss: 1.6219 - classification_loss: 0.2850 381/500 [=====================>........] - ETA: 30s - loss: 1.9069 - regression_loss: 1.6221 - classification_loss: 0.2849 382/500 [=====================>........] - ETA: 29s - loss: 1.9063 - regression_loss: 1.6217 - classification_loss: 0.2847 383/500 [=====================>........] - ETA: 29s - loss: 1.9063 - regression_loss: 1.6217 - classification_loss: 0.2846 384/500 [======================>.......] - ETA: 29s - loss: 1.9065 - regression_loss: 1.6220 - classification_loss: 0.2845 385/500 [======================>.......] - ETA: 29s - loss: 1.9056 - regression_loss: 1.6212 - classification_loss: 0.2844 386/500 [======================>.......] 
- ETA: 28s - loss: 1.9039 - regression_loss: 1.6199 - classification_loss: 0.2840 387/500 [======================>.......] - ETA: 28s - loss: 1.9052 - regression_loss: 1.6211 - classification_loss: 0.2840 388/500 [======================>.......] - ETA: 28s - loss: 1.9033 - regression_loss: 1.6196 - classification_loss: 0.2837 389/500 [======================>.......] - ETA: 28s - loss: 1.9033 - regression_loss: 1.6197 - classification_loss: 0.2837 390/500 [======================>.......] - ETA: 27s - loss: 1.9031 - regression_loss: 1.6196 - classification_loss: 0.2835 391/500 [======================>.......] - ETA: 27s - loss: 1.9006 - regression_loss: 1.6175 - classification_loss: 0.2831 392/500 [======================>.......] - ETA: 27s - loss: 1.9015 - regression_loss: 1.6183 - classification_loss: 0.2832 393/500 [======================>.......] - ETA: 27s - loss: 1.9013 - regression_loss: 1.6181 - classification_loss: 0.2833 394/500 [======================>.......] - ETA: 26s - loss: 1.9012 - regression_loss: 1.6179 - classification_loss: 0.2833 395/500 [======================>.......] - ETA: 26s - loss: 1.9016 - regression_loss: 1.6183 - classification_loss: 0.2833 396/500 [======================>.......] - ETA: 26s - loss: 1.9023 - regression_loss: 1.6191 - classification_loss: 0.2832 397/500 [======================>.......] - ETA: 26s - loss: 1.9023 - regression_loss: 1.6191 - classification_loss: 0.2832 398/500 [======================>.......] - ETA: 25s - loss: 1.9030 - regression_loss: 1.6195 - classification_loss: 0.2835 399/500 [======================>.......] - ETA: 25s - loss: 1.9060 - regression_loss: 1.6223 - classification_loss: 0.2838 400/500 [=======================>......] - ETA: 25s - loss: 1.9067 - regression_loss: 1.6227 - classification_loss: 0.2840 401/500 [=======================>......] - ETA: 25s - loss: 1.9073 - regression_loss: 1.6232 - classification_loss: 0.2841 402/500 [=======================>......] 
- ETA: 24s - loss: 1.9081 - regression_loss: 1.6239 - classification_loss: 0.2842 403/500 [=======================>......] - ETA: 24s - loss: 1.9087 - regression_loss: 1.6245 - classification_loss: 0.2842 404/500 [=======================>......] - ETA: 24s - loss: 1.9086 - regression_loss: 1.6245 - classification_loss: 0.2842 405/500 [=======================>......] - ETA: 24s - loss: 1.9091 - regression_loss: 1.6250 - classification_loss: 0.2841 406/500 [=======================>......] - ETA: 23s - loss: 1.9069 - regression_loss: 1.6231 - classification_loss: 0.2838 407/500 [=======================>......] - ETA: 23s - loss: 1.9068 - regression_loss: 1.6229 - classification_loss: 0.2839 408/500 [=======================>......] - ETA: 23s - loss: 1.9064 - regression_loss: 1.6227 - classification_loss: 0.2837 409/500 [=======================>......] - ETA: 23s - loss: 1.9071 - regression_loss: 1.6233 - classification_loss: 0.2838 410/500 [=======================>......] - ETA: 22s - loss: 1.9063 - regression_loss: 1.6226 - classification_loss: 0.2837 411/500 [=======================>......] - ETA: 22s - loss: 1.9041 - regression_loss: 1.6205 - classification_loss: 0.2836 412/500 [=======================>......] - ETA: 22s - loss: 1.9034 - regression_loss: 1.6202 - classification_loss: 0.2832 413/500 [=======================>......] - ETA: 22s - loss: 1.9028 - regression_loss: 1.6197 - classification_loss: 0.2831 414/500 [=======================>......] - ETA: 21s - loss: 1.9030 - regression_loss: 1.6200 - classification_loss: 0.2830 415/500 [=======================>......] - ETA: 21s - loss: 1.9028 - regression_loss: 1.6199 - classification_loss: 0.2830 416/500 [=======================>......] - ETA: 21s - loss: 1.9029 - regression_loss: 1.6199 - classification_loss: 0.2830 417/500 [========================>.....] - ETA: 21s - loss: 1.9026 - regression_loss: 1.6197 - classification_loss: 0.2829 418/500 [========================>.....] 
- ETA: 20s - loss: 1.9023 - regression_loss: 1.6192 - classification_loss: 0.2831 419/500 [========================>.....] - ETA: 20s - loss: 1.9017 - regression_loss: 1.6187 - classification_loss: 0.2831 420/500 [========================>.....] - ETA: 20s - loss: 1.9032 - regression_loss: 1.6197 - classification_loss: 0.2835 421/500 [========================>.....] - ETA: 20s - loss: 1.9035 - regression_loss: 1.6198 - classification_loss: 0.2837 422/500 [========================>.....] - ETA: 19s - loss: 1.9036 - regression_loss: 1.6200 - classification_loss: 0.2836 423/500 [========================>.....] - ETA: 19s - loss: 1.9016 - regression_loss: 1.6183 - classification_loss: 0.2833 424/500 [========================>.....] - ETA: 19s - loss: 1.9021 - regression_loss: 1.6187 - classification_loss: 0.2834 425/500 [========================>.....] - ETA: 18s - loss: 1.9006 - regression_loss: 1.6174 - classification_loss: 0.2832 426/500 [========================>.....] - ETA: 18s - loss: 1.9009 - regression_loss: 1.6177 - classification_loss: 0.2832 427/500 [========================>.....] - ETA: 18s - loss: 1.9008 - regression_loss: 1.6176 - classification_loss: 0.2831 428/500 [========================>.....] - ETA: 18s - loss: 1.8992 - regression_loss: 1.6162 - classification_loss: 0.2830 429/500 [========================>.....] - ETA: 17s - loss: 1.9006 - regression_loss: 1.6176 - classification_loss: 0.2829 430/500 [========================>.....] - ETA: 17s - loss: 1.8994 - regression_loss: 1.6167 - classification_loss: 0.2827 431/500 [========================>.....] - ETA: 17s - loss: 1.8986 - regression_loss: 1.6161 - classification_loss: 0.2825 432/500 [========================>.....] - ETA: 17s - loss: 1.8972 - regression_loss: 1.6149 - classification_loss: 0.2823 433/500 [========================>.....] - ETA: 16s - loss: 1.8957 - regression_loss: 1.6138 - classification_loss: 0.2819 434/500 [=========================>....] 
- ETA: 16s - loss: 1.8959 - regression_loss: 1.6139 - classification_loss: 0.2820 435/500 [=========================>....] - ETA: 16s - loss: 1.8954 - regression_loss: 1.6133 - classification_loss: 0.2821 436/500 [=========================>....] - ETA: 16s - loss: 1.8951 - regression_loss: 1.6130 - classification_loss: 0.2821 437/500 [=========================>....] - ETA: 15s - loss: 1.8942 - regression_loss: 1.6123 - classification_loss: 0.2819 438/500 [=========================>....] - ETA: 15s - loss: 1.8933 - regression_loss: 1.6116 - classification_loss: 0.2818 439/500 [=========================>....] - ETA: 15s - loss: 1.8935 - regression_loss: 1.6116 - classification_loss: 0.2818 440/500 [=========================>....] - ETA: 15s - loss: 1.8924 - regression_loss: 1.6108 - classification_loss: 0.2816 441/500 [=========================>....] - ETA: 14s - loss: 1.8926 - regression_loss: 1.6109 - classification_loss: 0.2817 442/500 [=========================>....] - ETA: 14s - loss: 1.8921 - regression_loss: 1.6105 - classification_loss: 0.2816 443/500 [=========================>....] - ETA: 14s - loss: 1.8926 - regression_loss: 1.6110 - classification_loss: 0.2816 444/500 [=========================>....] - ETA: 14s - loss: 1.8929 - regression_loss: 1.6112 - classification_loss: 0.2817 445/500 [=========================>....] - ETA: 13s - loss: 1.8927 - regression_loss: 1.6110 - classification_loss: 0.2816 446/500 [=========================>....] - ETA: 13s - loss: 1.8914 - regression_loss: 1.6099 - classification_loss: 0.2815 447/500 [=========================>....] - ETA: 13s - loss: 1.8922 - regression_loss: 1.6103 - classification_loss: 0.2818 448/500 [=========================>....] - ETA: 13s - loss: 1.8915 - regression_loss: 1.6098 - classification_loss: 0.2817 449/500 [=========================>....] - ETA: 12s - loss: 1.8907 - regression_loss: 1.6091 - classification_loss: 0.2816 450/500 [==========================>...] 
- ETA: 12s - loss: 1.8907 - regression_loss: 1.6091 - classification_loss: 0.2816 451/500 [==========================>...] - ETA: 12s - loss: 1.8901 - regression_loss: 1.6087 - classification_loss: 0.2815 452/500 [==========================>...] - ETA: 12s - loss: 1.8919 - regression_loss: 1.6101 - classification_loss: 0.2818 453/500 [==========================>...] - ETA: 11s - loss: 1.8936 - regression_loss: 1.6118 - classification_loss: 0.2818 454/500 [==========================>...] - ETA: 11s - loss: 1.8936 - regression_loss: 1.6118 - classification_loss: 0.2817 455/500 [==========================>...] - ETA: 11s - loss: 1.8927 - regression_loss: 1.6111 - classification_loss: 0.2816 456/500 [==========================>...] - ETA: 11s - loss: 1.8926 - regression_loss: 1.6112 - classification_loss: 0.2815 457/500 [==========================>...] - ETA: 10s - loss: 1.8933 - regression_loss: 1.6119 - classification_loss: 0.2814 458/500 [==========================>...] - ETA: 10s - loss: 1.8931 - regression_loss: 1.6118 - classification_loss: 0.2813 459/500 [==========================>...] - ETA: 10s - loss: 1.8926 - regression_loss: 1.6113 - classification_loss: 0.2813 460/500 [==========================>...] - ETA: 10s - loss: 1.8930 - regression_loss: 1.6117 - classification_loss: 0.2813 461/500 [==========================>...] - ETA: 9s - loss: 1.8928 - regression_loss: 1.6116 - classification_loss: 0.2812  462/500 [==========================>...] - ETA: 9s - loss: 1.8902 - regression_loss: 1.6094 - classification_loss: 0.2808 463/500 [==========================>...] - ETA: 9s - loss: 1.8903 - regression_loss: 1.6094 - classification_loss: 0.2809 464/500 [==========================>...] - ETA: 9s - loss: 1.8908 - regression_loss: 1.6099 - classification_loss: 0.2809 465/500 [==========================>...] - ETA: 8s - loss: 1.8913 - regression_loss: 1.6104 - classification_loss: 0.2809 466/500 [==========================>...] 
500/500 [==============================] - 127s 253ms/step - loss: 1.8929 - regression_loss: 1.6113 - classification_loss: 0.2816
1172 instances of class plum with average precision: 0.6315
mAP: 0.6315
Epoch 00007: saving model to ./training/snapshots/resnet50_pascal_07.h5
Epoch 8/150
1/500 [..............................] - ETA: 1:44 - loss: 2.1327 - regression_loss: 1.8198 - classification_loss: 0.3129
100/500 [=====>........................] - ETA: 1:41 - loss: 1.8355 - regression_loss: 1.5719 - classification_loss: 0.2636
200/500 [===========>..................] - ETA: 1:16 - loss: 1.8447 - regression_loss: 1.5712 - classification_loss: 0.2736
300/500 [=================>............] - ETA: 50s - loss: 1.8305 - regression_loss: 1.5638 - classification_loss: 0.2666
301/500 [=================>............]
- ETA: 50s - loss: 1.8311 - regression_loss: 1.5642 - classification_loss: 0.2669 302/500 [=================>............] - ETA: 50s - loss: 1.8306 - regression_loss: 1.5638 - classification_loss: 0.2668 303/500 [=================>............] - ETA: 50s - loss: 1.8290 - regression_loss: 1.5625 - classification_loss: 0.2665 304/500 [=================>............] - ETA: 49s - loss: 1.8294 - regression_loss: 1.5630 - classification_loss: 0.2665 305/500 [=================>............] - ETA: 49s - loss: 1.8273 - regression_loss: 1.5611 - classification_loss: 0.2662 306/500 [=================>............] - ETA: 49s - loss: 1.8267 - regression_loss: 1.5607 - classification_loss: 0.2660 307/500 [=================>............] - ETA: 49s - loss: 1.8255 - regression_loss: 1.5597 - classification_loss: 0.2658 308/500 [=================>............] - ETA: 48s - loss: 1.8238 - regression_loss: 1.5584 - classification_loss: 0.2655 309/500 [=================>............] - ETA: 48s - loss: 1.8237 - regression_loss: 1.5583 - classification_loss: 0.2654 310/500 [=================>............] - ETA: 48s - loss: 1.8224 - regression_loss: 1.5572 - classification_loss: 0.2652 311/500 [=================>............] - ETA: 47s - loss: 1.8215 - regression_loss: 1.5567 - classification_loss: 0.2648 312/500 [=================>............] - ETA: 47s - loss: 1.8211 - regression_loss: 1.5564 - classification_loss: 0.2647 313/500 [=================>............] - ETA: 47s - loss: 1.8207 - regression_loss: 1.5563 - classification_loss: 0.2643 314/500 [=================>............] - ETA: 47s - loss: 1.8200 - regression_loss: 1.5557 - classification_loss: 0.2643 315/500 [=================>............] - ETA: 46s - loss: 1.8181 - regression_loss: 1.5541 - classification_loss: 0.2641 316/500 [=================>............] - ETA: 46s - loss: 1.8187 - regression_loss: 1.5543 - classification_loss: 0.2644 317/500 [==================>...........] 
- ETA: 46s - loss: 1.8178 - regression_loss: 1.5535 - classification_loss: 0.2643 318/500 [==================>...........] - ETA: 46s - loss: 1.8178 - regression_loss: 1.5535 - classification_loss: 0.2643 319/500 [==================>...........] - ETA: 45s - loss: 1.8175 - regression_loss: 1.5534 - classification_loss: 0.2641 320/500 [==================>...........] - ETA: 45s - loss: 1.8176 - regression_loss: 1.5534 - classification_loss: 0.2641 321/500 [==================>...........] - ETA: 45s - loss: 1.8176 - regression_loss: 1.5536 - classification_loss: 0.2641 322/500 [==================>...........] - ETA: 45s - loss: 1.8189 - regression_loss: 1.5548 - classification_loss: 0.2642 323/500 [==================>...........] - ETA: 44s - loss: 1.8193 - regression_loss: 1.5551 - classification_loss: 0.2642 324/500 [==================>...........] - ETA: 44s - loss: 1.8200 - regression_loss: 1.5557 - classification_loss: 0.2643 325/500 [==================>...........] - ETA: 44s - loss: 1.8197 - regression_loss: 1.5554 - classification_loss: 0.2643 326/500 [==================>...........] - ETA: 44s - loss: 1.8213 - regression_loss: 1.5567 - classification_loss: 0.2646 327/500 [==================>...........] - ETA: 43s - loss: 1.8203 - regression_loss: 1.5557 - classification_loss: 0.2646 328/500 [==================>...........] - ETA: 43s - loss: 1.8221 - regression_loss: 1.5572 - classification_loss: 0.2649 329/500 [==================>...........] - ETA: 43s - loss: 1.8216 - regression_loss: 1.5569 - classification_loss: 0.2647 330/500 [==================>...........] - ETA: 43s - loss: 1.8199 - regression_loss: 1.5554 - classification_loss: 0.2645 331/500 [==================>...........] - ETA: 42s - loss: 1.8203 - regression_loss: 1.5557 - classification_loss: 0.2646 332/500 [==================>...........] - ETA: 42s - loss: 1.8195 - regression_loss: 1.5552 - classification_loss: 0.2643 333/500 [==================>...........] 
- ETA: 42s - loss: 1.8200 - regression_loss: 1.5556 - classification_loss: 0.2643 334/500 [===================>..........] - ETA: 42s - loss: 1.8187 - regression_loss: 1.5545 - classification_loss: 0.2642 335/500 [===================>..........] - ETA: 41s - loss: 1.8193 - regression_loss: 1.5551 - classification_loss: 0.2642 336/500 [===================>..........] - ETA: 41s - loss: 1.8170 - regression_loss: 1.5531 - classification_loss: 0.2639 337/500 [===================>..........] - ETA: 41s - loss: 1.8180 - regression_loss: 1.5536 - classification_loss: 0.2644 338/500 [===================>..........] - ETA: 41s - loss: 1.8163 - regression_loss: 1.5523 - classification_loss: 0.2640 339/500 [===================>..........] - ETA: 40s - loss: 1.8168 - regression_loss: 1.5528 - classification_loss: 0.2641 340/500 [===================>..........] - ETA: 40s - loss: 1.8146 - regression_loss: 1.5509 - classification_loss: 0.2637 341/500 [===================>..........] - ETA: 40s - loss: 1.8159 - regression_loss: 1.5520 - classification_loss: 0.2639 342/500 [===================>..........] - ETA: 40s - loss: 1.8156 - regression_loss: 1.5517 - classification_loss: 0.2640 343/500 [===================>..........] - ETA: 39s - loss: 1.8149 - regression_loss: 1.5510 - classification_loss: 0.2638 344/500 [===================>..........] - ETA: 39s - loss: 1.8135 - regression_loss: 1.5498 - classification_loss: 0.2637 345/500 [===================>..........] - ETA: 39s - loss: 1.8117 - regression_loss: 1.5483 - classification_loss: 0.2634 346/500 [===================>..........] - ETA: 39s - loss: 1.8116 - regression_loss: 1.5482 - classification_loss: 0.2634 347/500 [===================>..........] - ETA: 38s - loss: 1.8116 - regression_loss: 1.5483 - classification_loss: 0.2633 348/500 [===================>..........] - ETA: 38s - loss: 1.8116 - regression_loss: 1.5485 - classification_loss: 0.2632 349/500 [===================>..........] 
- ETA: 38s - loss: 1.8129 - regression_loss: 1.5495 - classification_loss: 0.2634 350/500 [====================>.........] - ETA: 38s - loss: 1.8140 - regression_loss: 1.5506 - classification_loss: 0.2634 351/500 [====================>.........] - ETA: 37s - loss: 1.8156 - regression_loss: 1.5519 - classification_loss: 0.2637 352/500 [====================>.........] - ETA: 37s - loss: 1.8158 - regression_loss: 1.5520 - classification_loss: 0.2638 353/500 [====================>.........] - ETA: 37s - loss: 1.8162 - regression_loss: 1.5525 - classification_loss: 0.2637 354/500 [====================>.........] - ETA: 37s - loss: 1.8165 - regression_loss: 1.5527 - classification_loss: 0.2638 355/500 [====================>.........] - ETA: 36s - loss: 1.8171 - regression_loss: 1.5532 - classification_loss: 0.2639 356/500 [====================>.........] - ETA: 36s - loss: 1.8179 - regression_loss: 1.5541 - classification_loss: 0.2637 357/500 [====================>.........] - ETA: 36s - loss: 1.8182 - regression_loss: 1.5544 - classification_loss: 0.2638 358/500 [====================>.........] - ETA: 36s - loss: 1.8175 - regression_loss: 1.5538 - classification_loss: 0.2637 359/500 [====================>.........] - ETA: 35s - loss: 1.8180 - regression_loss: 1.5541 - classification_loss: 0.2638 360/500 [====================>.........] - ETA: 35s - loss: 1.8181 - regression_loss: 1.5541 - classification_loss: 0.2639 361/500 [====================>.........] - ETA: 35s - loss: 1.8179 - regression_loss: 1.5541 - classification_loss: 0.2638 362/500 [====================>.........] - ETA: 35s - loss: 1.8169 - regression_loss: 1.5535 - classification_loss: 0.2634 363/500 [====================>.........] - ETA: 34s - loss: 1.8179 - regression_loss: 1.5546 - classification_loss: 0.2634 364/500 [====================>.........] - ETA: 34s - loss: 1.8180 - regression_loss: 1.5547 - classification_loss: 0.2633 365/500 [====================>.........] 
- ETA: 34s - loss: 1.8181 - regression_loss: 1.5548 - classification_loss: 0.2633 366/500 [====================>.........] - ETA: 33s - loss: 1.8178 - regression_loss: 1.5546 - classification_loss: 0.2632 367/500 [=====================>........] - ETA: 33s - loss: 1.8193 - regression_loss: 1.5558 - classification_loss: 0.2634 368/500 [=====================>........] - ETA: 33s - loss: 1.8194 - regression_loss: 1.5560 - classification_loss: 0.2634 369/500 [=====================>........] - ETA: 33s - loss: 1.8184 - regression_loss: 1.5552 - classification_loss: 0.2632 370/500 [=====================>........] - ETA: 32s - loss: 1.8180 - regression_loss: 1.5548 - classification_loss: 0.2631 371/500 [=====================>........] - ETA: 32s - loss: 1.8186 - regression_loss: 1.5555 - classification_loss: 0.2631 372/500 [=====================>........] - ETA: 32s - loss: 1.8153 - regression_loss: 1.5527 - classification_loss: 0.2626 373/500 [=====================>........] - ETA: 32s - loss: 1.8150 - regression_loss: 1.5526 - classification_loss: 0.2625 374/500 [=====================>........] - ETA: 31s - loss: 1.8139 - regression_loss: 1.5516 - classification_loss: 0.2623 375/500 [=====================>........] - ETA: 31s - loss: 1.8142 - regression_loss: 1.5519 - classification_loss: 0.2622 376/500 [=====================>........] - ETA: 31s - loss: 1.8142 - regression_loss: 1.5522 - classification_loss: 0.2620 377/500 [=====================>........] - ETA: 31s - loss: 1.8153 - regression_loss: 1.5532 - classification_loss: 0.2621 378/500 [=====================>........] - ETA: 30s - loss: 1.8156 - regression_loss: 1.5532 - classification_loss: 0.2624 379/500 [=====================>........] - ETA: 30s - loss: 1.8163 - regression_loss: 1.5540 - classification_loss: 0.2623 380/500 [=====================>........] - ETA: 30s - loss: 1.8150 - regression_loss: 1.5528 - classification_loss: 0.2622 381/500 [=====================>........] 
- ETA: 30s - loss: 1.8154 - regression_loss: 1.5532 - classification_loss: 0.2622 382/500 [=====================>........] - ETA: 29s - loss: 1.8138 - regression_loss: 1.5519 - classification_loss: 0.2618 383/500 [=====================>........] - ETA: 29s - loss: 1.8138 - regression_loss: 1.5520 - classification_loss: 0.2618 384/500 [======================>.......] - ETA: 29s - loss: 1.8147 - regression_loss: 1.5527 - classification_loss: 0.2619 385/500 [======================>.......] - ETA: 29s - loss: 1.8130 - regression_loss: 1.5513 - classification_loss: 0.2617 386/500 [======================>.......] - ETA: 28s - loss: 1.8114 - regression_loss: 1.5499 - classification_loss: 0.2615 387/500 [======================>.......] - ETA: 28s - loss: 1.8141 - regression_loss: 1.5522 - classification_loss: 0.2620 388/500 [======================>.......] - ETA: 28s - loss: 1.8149 - regression_loss: 1.5529 - classification_loss: 0.2620 389/500 [======================>.......] - ETA: 28s - loss: 1.8148 - regression_loss: 1.5529 - classification_loss: 0.2619 390/500 [======================>.......] - ETA: 27s - loss: 1.8153 - regression_loss: 1.5534 - classification_loss: 0.2619 391/500 [======================>.......] - ETA: 27s - loss: 1.8162 - regression_loss: 1.5542 - classification_loss: 0.2620 392/500 [======================>.......] - ETA: 27s - loss: 1.8136 - regression_loss: 1.5518 - classification_loss: 0.2617 393/500 [======================>.......] - ETA: 27s - loss: 1.8140 - regression_loss: 1.5522 - classification_loss: 0.2617 394/500 [======================>.......] - ETA: 26s - loss: 1.8136 - regression_loss: 1.5520 - classification_loss: 0.2616 395/500 [======================>.......] - ETA: 26s - loss: 1.8129 - regression_loss: 1.5514 - classification_loss: 0.2615 396/500 [======================>.......] - ETA: 26s - loss: 1.8124 - regression_loss: 1.5509 - classification_loss: 0.2614 397/500 [======================>.......] 
- ETA: 26s - loss: 1.8139 - regression_loss: 1.5523 - classification_loss: 0.2616 398/500 [======================>.......] - ETA: 25s - loss: 1.8147 - regression_loss: 1.5529 - classification_loss: 0.2618 399/500 [======================>.......] - ETA: 25s - loss: 1.8150 - regression_loss: 1.5533 - classification_loss: 0.2618 400/500 [=======================>......] - ETA: 25s - loss: 1.8122 - regression_loss: 1.5508 - classification_loss: 0.2613 401/500 [=======================>......] - ETA: 25s - loss: 1.8126 - regression_loss: 1.5512 - classification_loss: 0.2614 402/500 [=======================>......] - ETA: 24s - loss: 1.8143 - regression_loss: 1.5527 - classification_loss: 0.2616 403/500 [=======================>......] - ETA: 24s - loss: 1.8166 - regression_loss: 1.5549 - classification_loss: 0.2617 404/500 [=======================>......] - ETA: 24s - loss: 1.8171 - regression_loss: 1.5553 - classification_loss: 0.2618 405/500 [=======================>......] - ETA: 24s - loss: 1.8176 - regression_loss: 1.5557 - classification_loss: 0.2620 406/500 [=======================>......] - ETA: 23s - loss: 1.8183 - regression_loss: 1.5563 - classification_loss: 0.2620 407/500 [=======================>......] - ETA: 23s - loss: 1.8186 - regression_loss: 1.5566 - classification_loss: 0.2621 408/500 [=======================>......] - ETA: 23s - loss: 1.8211 - regression_loss: 1.5583 - classification_loss: 0.2628 409/500 [=======================>......] - ETA: 23s - loss: 1.8224 - regression_loss: 1.5594 - classification_loss: 0.2630 410/500 [=======================>......] - ETA: 22s - loss: 1.8232 - regression_loss: 1.5601 - classification_loss: 0.2631 411/500 [=======================>......] - ETA: 22s - loss: 1.8229 - regression_loss: 1.5599 - classification_loss: 0.2631 412/500 [=======================>......] - ETA: 22s - loss: 1.8244 - regression_loss: 1.5611 - classification_loss: 0.2633 413/500 [=======================>......] 
- ETA: 22s - loss: 1.8248 - regression_loss: 1.5615 - classification_loss: 0.2634 414/500 [=======================>......] - ETA: 21s - loss: 1.8247 - regression_loss: 1.5611 - classification_loss: 0.2636 415/500 [=======================>......] - ETA: 21s - loss: 1.8246 - regression_loss: 1.5610 - classification_loss: 0.2636 416/500 [=======================>......] - ETA: 21s - loss: 1.8247 - regression_loss: 1.5611 - classification_loss: 0.2636 417/500 [========================>.....] - ETA: 21s - loss: 1.8226 - regression_loss: 1.5592 - classification_loss: 0.2634 418/500 [========================>.....] - ETA: 20s - loss: 1.8232 - regression_loss: 1.5597 - classification_loss: 0.2635 419/500 [========================>.....] - ETA: 20s - loss: 1.8232 - regression_loss: 1.5595 - classification_loss: 0.2637 420/500 [========================>.....] - ETA: 20s - loss: 1.8242 - regression_loss: 1.5604 - classification_loss: 0.2638 421/500 [========================>.....] - ETA: 20s - loss: 1.8251 - regression_loss: 1.5612 - classification_loss: 0.2638 422/500 [========================>.....] - ETA: 19s - loss: 1.8252 - regression_loss: 1.5613 - classification_loss: 0.2639 423/500 [========================>.....] - ETA: 19s - loss: 1.8242 - regression_loss: 1.5605 - classification_loss: 0.2637 424/500 [========================>.....] - ETA: 19s - loss: 1.8235 - regression_loss: 1.5600 - classification_loss: 0.2635 425/500 [========================>.....] - ETA: 19s - loss: 1.8237 - regression_loss: 1.5602 - classification_loss: 0.2635 426/500 [========================>.....] - ETA: 18s - loss: 1.8240 - regression_loss: 1.5606 - classification_loss: 0.2634 427/500 [========================>.....] - ETA: 18s - loss: 1.8242 - regression_loss: 1.5609 - classification_loss: 0.2633 428/500 [========================>.....] - ETA: 18s - loss: 1.8243 - regression_loss: 1.5610 - classification_loss: 0.2633 429/500 [========================>.....] 
- ETA: 18s - loss: 1.8232 - regression_loss: 1.5601 - classification_loss: 0.2631 430/500 [========================>.....] - ETA: 17s - loss: 1.8222 - regression_loss: 1.5593 - classification_loss: 0.2628 431/500 [========================>.....] - ETA: 17s - loss: 1.8229 - regression_loss: 1.5601 - classification_loss: 0.2629 432/500 [========================>.....] - ETA: 17s - loss: 1.8226 - regression_loss: 1.5598 - classification_loss: 0.2627 433/500 [========================>.....] - ETA: 16s - loss: 1.8222 - regression_loss: 1.5596 - classification_loss: 0.2626 434/500 [=========================>....] - ETA: 16s - loss: 1.8228 - regression_loss: 1.5601 - classification_loss: 0.2627 435/500 [=========================>....] - ETA: 16s - loss: 1.8220 - regression_loss: 1.5595 - classification_loss: 0.2626 436/500 [=========================>....] - ETA: 16s - loss: 1.8210 - regression_loss: 1.5587 - classification_loss: 0.2623 437/500 [=========================>....] - ETA: 15s - loss: 1.8216 - regression_loss: 1.5589 - classification_loss: 0.2627 438/500 [=========================>....] - ETA: 15s - loss: 1.8211 - regression_loss: 1.5584 - classification_loss: 0.2627 439/500 [=========================>....] - ETA: 15s - loss: 1.8213 - regression_loss: 1.5587 - classification_loss: 0.2627 440/500 [=========================>....] - ETA: 15s - loss: 1.8218 - regression_loss: 1.5591 - classification_loss: 0.2628 441/500 [=========================>....] - ETA: 14s - loss: 1.8203 - regression_loss: 1.5578 - classification_loss: 0.2625 442/500 [=========================>....] - ETA: 14s - loss: 1.8199 - regression_loss: 1.5574 - classification_loss: 0.2624 443/500 [=========================>....] - ETA: 14s - loss: 1.8201 - regression_loss: 1.5576 - classification_loss: 0.2625 444/500 [=========================>....] - ETA: 14s - loss: 1.8204 - regression_loss: 1.5579 - classification_loss: 0.2625 445/500 [=========================>....] 
- ETA: 13s - loss: 1.8201 - regression_loss: 1.5576 - classification_loss: 0.2626 446/500 [=========================>....] - ETA: 13s - loss: 1.8195 - regression_loss: 1.5571 - classification_loss: 0.2624 447/500 [=========================>....] - ETA: 13s - loss: 1.8177 - regression_loss: 1.5556 - classification_loss: 0.2622 448/500 [=========================>....] - ETA: 13s - loss: 1.8172 - regression_loss: 1.5552 - classification_loss: 0.2621 449/500 [=========================>....] - ETA: 12s - loss: 1.8173 - regression_loss: 1.5552 - classification_loss: 0.2620 450/500 [==========================>...] - ETA: 12s - loss: 1.8163 - regression_loss: 1.5542 - classification_loss: 0.2621 451/500 [==========================>...] - ETA: 12s - loss: 1.8169 - regression_loss: 1.5548 - classification_loss: 0.2621 452/500 [==========================>...] - ETA: 12s - loss: 1.8160 - regression_loss: 1.5541 - classification_loss: 0.2619 453/500 [==========================>...] - ETA: 11s - loss: 1.8164 - regression_loss: 1.5545 - classification_loss: 0.2619 454/500 [==========================>...] - ETA: 11s - loss: 1.8161 - regression_loss: 1.5544 - classification_loss: 0.2617 455/500 [==========================>...] - ETA: 11s - loss: 1.8158 - regression_loss: 1.5541 - classification_loss: 0.2617 456/500 [==========================>...] - ETA: 11s - loss: 1.8161 - regression_loss: 1.5543 - classification_loss: 0.2618 457/500 [==========================>...] - ETA: 10s - loss: 1.8169 - regression_loss: 1.5550 - classification_loss: 0.2619 458/500 [==========================>...] - ETA: 10s - loss: 1.8159 - regression_loss: 1.5541 - classification_loss: 0.2617 459/500 [==========================>...] - ETA: 10s - loss: 1.8164 - regression_loss: 1.5546 - classification_loss: 0.2619 460/500 [==========================>...] - ETA: 10s - loss: 1.8162 - regression_loss: 1.5544 - classification_loss: 0.2618 461/500 [==========================>...] 
- ETA: 9s - loss: 1.8174 - regression_loss: 1.5554 - classification_loss: 0.2620  462/500 [==========================>...] - ETA: 9s - loss: 1.8174 - regression_loss: 1.5553 - classification_loss: 0.2621 463/500 [==========================>...] - ETA: 9s - loss: 1.8176 - regression_loss: 1.5557 - classification_loss: 0.2619 464/500 [==========================>...] - ETA: 9s - loss: 1.8171 - regression_loss: 1.5553 - classification_loss: 0.2618 465/500 [==========================>...] - ETA: 8s - loss: 1.8178 - regression_loss: 1.5558 - classification_loss: 0.2620 466/500 [==========================>...] - ETA: 8s - loss: 1.8179 - regression_loss: 1.5558 - classification_loss: 0.2620 467/500 [===========================>..] - ETA: 8s - loss: 1.8192 - regression_loss: 1.5570 - classification_loss: 0.2622 468/500 [===========================>..] - ETA: 8s - loss: 1.8199 - regression_loss: 1.5575 - classification_loss: 0.2624 469/500 [===========================>..] - ETA: 7s - loss: 1.8197 - regression_loss: 1.5573 - classification_loss: 0.2624 470/500 [===========================>..] - ETA: 7s - loss: 1.8200 - regression_loss: 1.5575 - classification_loss: 0.2625 471/500 [===========================>..] - ETA: 7s - loss: 1.8192 - regression_loss: 1.5568 - classification_loss: 0.2624 472/500 [===========================>..] - ETA: 7s - loss: 1.8203 - regression_loss: 1.5573 - classification_loss: 0.2630 473/500 [===========================>..] - ETA: 6s - loss: 1.8205 - regression_loss: 1.5574 - classification_loss: 0.2631 474/500 [===========================>..] - ETA: 6s - loss: 1.8209 - regression_loss: 1.5578 - classification_loss: 0.2631 475/500 [===========================>..] - ETA: 6s - loss: 1.8211 - regression_loss: 1.5581 - classification_loss: 0.2630 476/500 [===========================>..] - ETA: 6s - loss: 1.8215 - regression_loss: 1.5585 - classification_loss: 0.2630 477/500 [===========================>..] 
- ETA: 5s - loss: 1.8223 - regression_loss: 1.5592 - classification_loss: 0.2631 478/500 [===========================>..] - ETA: 5s - loss: 1.8223 - regression_loss: 1.5593 - classification_loss: 0.2630 479/500 [===========================>..] - ETA: 5s - loss: 1.8218 - regression_loss: 1.5589 - classification_loss: 0.2629 480/500 [===========================>..] - ETA: 5s - loss: 1.8193 - regression_loss: 1.5566 - classification_loss: 0.2626 481/500 [===========================>..] - ETA: 4s - loss: 1.8207 - regression_loss: 1.5580 - classification_loss: 0.2627 482/500 [===========================>..] - ETA: 4s - loss: 1.8215 - regression_loss: 1.5586 - classification_loss: 0.2630 483/500 [===========================>..] - ETA: 4s - loss: 1.8215 - regression_loss: 1.5586 - classification_loss: 0.2629 484/500 [============================>.] - ETA: 4s - loss: 1.8213 - regression_loss: 1.5583 - classification_loss: 0.2630 485/500 [============================>.] - ETA: 3s - loss: 1.8217 - regression_loss: 1.5586 - classification_loss: 0.2631 486/500 [============================>.] - ETA: 3s - loss: 1.8203 - regression_loss: 1.5576 - classification_loss: 0.2627 487/500 [============================>.] - ETA: 3s - loss: 1.8186 - regression_loss: 1.5561 - classification_loss: 0.2625 488/500 [============================>.] - ETA: 3s - loss: 1.8190 - regression_loss: 1.5564 - classification_loss: 0.2626 489/500 [============================>.] - ETA: 2s - loss: 1.8203 - regression_loss: 1.5576 - classification_loss: 0.2627 490/500 [============================>.] - ETA: 2s - loss: 1.8213 - regression_loss: 1.5585 - classification_loss: 0.2627 491/500 [============================>.] - ETA: 2s - loss: 1.8225 - regression_loss: 1.5597 - classification_loss: 0.2628 492/500 [============================>.] - ETA: 2s - loss: 1.8235 - regression_loss: 1.5606 - classification_loss: 0.2629 493/500 [============================>.] 
- ETA: 1s - loss: 1.8237 - regression_loss: 1.5608 - classification_loss: 0.2629 494/500 [============================>.] - ETA: 1s - loss: 1.8238 - regression_loss: 1.5609 - classification_loss: 0.2629 495/500 [============================>.] - ETA: 1s - loss: 1.8236 - regression_loss: 1.5606 - classification_loss: 0.2629 496/500 [============================>.] - ETA: 1s - loss: 1.8232 - regression_loss: 1.5603 - classification_loss: 0.2629 497/500 [============================>.] - ETA: 0s - loss: 1.8218 - regression_loss: 1.5592 - classification_loss: 0.2626 498/500 [============================>.] - ETA: 0s - loss: 1.8206 - regression_loss: 1.5581 - classification_loss: 0.2626 499/500 [============================>.] - ETA: 0s - loss: 1.8204 - regression_loss: 1.5579 - classification_loss: 0.2625 500/500 [==============================] - 127s 254ms/step - loss: 1.8214 - regression_loss: 1.5586 - classification_loss: 0.2628 1172 instances of class plum with average precision: 0.6154 mAP: 0.6154 Epoch 00008: saving model to ./training/snapshots/resnet50_pascal_08.h5 Epoch 9/150 1/500 [..............................] - ETA: 2:05 - loss: 1.6080 - regression_loss: 1.3914 - classification_loss: 0.2166 2/500 [..............................] - ETA: 2:07 - loss: 1.4267 - regression_loss: 1.2381 - classification_loss: 0.1886 3/500 [..............................] - ETA: 2:06 - loss: 1.3723 - regression_loss: 1.2099 - classification_loss: 0.1624 4/500 [..............................] - ETA: 2:07 - loss: 1.5462 - regression_loss: 1.3502 - classification_loss: 0.1959 5/500 [..............................] - ETA: 2:07 - loss: 1.5190 - regression_loss: 1.3273 - classification_loss: 0.1917 6/500 [..............................] - ETA: 2:06 - loss: 1.5677 - regression_loss: 1.3647 - classification_loss: 0.2030 7/500 [..............................] - ETA: 2:05 - loss: 1.6302 - regression_loss: 1.4050 - classification_loss: 0.2252 8/500 [..............................] 
[Keras per-batch progress output condensed: steps 9/500 through 344/500, ETA counting down from 2:05 to ~39s. Total loss fluctuated between ~1.72 (step 68) and ~1.83 (step 18), settling near 1.77 by step 344 (regression_loss ≈ 1.51, classification_loss ≈ 0.256).]
- ETA: 38s - loss: 1.7689 - regression_loss: 1.5127 - classification_loss: 0.2562 345/500 [===================>..........] - ETA: 38s - loss: 1.7690 - regression_loss: 1.5129 - classification_loss: 0.2562 346/500 [===================>..........] - ETA: 38s - loss: 1.7660 - regression_loss: 1.5103 - classification_loss: 0.2556 347/500 [===================>..........] - ETA: 38s - loss: 1.7661 - regression_loss: 1.5105 - classification_loss: 0.2556 348/500 [===================>..........] - ETA: 37s - loss: 1.7666 - regression_loss: 1.5110 - classification_loss: 0.2556 349/500 [===================>..........] - ETA: 37s - loss: 1.7640 - regression_loss: 1.5088 - classification_loss: 0.2552 350/500 [====================>.........] - ETA: 37s - loss: 1.7639 - regression_loss: 1.5089 - classification_loss: 0.2550 351/500 [====================>.........] - ETA: 37s - loss: 1.7609 - regression_loss: 1.5063 - classification_loss: 0.2545 352/500 [====================>.........] - ETA: 36s - loss: 1.7604 - regression_loss: 1.5061 - classification_loss: 0.2543 353/500 [====================>.........] - ETA: 36s - loss: 1.7596 - regression_loss: 1.5054 - classification_loss: 0.2542 354/500 [====================>.........] - ETA: 36s - loss: 1.7589 - regression_loss: 1.5048 - classification_loss: 0.2541 355/500 [====================>.........] - ETA: 36s - loss: 1.7561 - regression_loss: 1.5023 - classification_loss: 0.2539 356/500 [====================>.........] - ETA: 35s - loss: 1.7556 - regression_loss: 1.5018 - classification_loss: 0.2537 357/500 [====================>.........] - ETA: 35s - loss: 1.7554 - regression_loss: 1.5018 - classification_loss: 0.2536 358/500 [====================>.........] - ETA: 35s - loss: 1.7562 - regression_loss: 1.5024 - classification_loss: 0.2538 359/500 [====================>.........] - ETA: 35s - loss: 1.7571 - regression_loss: 1.5032 - classification_loss: 0.2539 360/500 [====================>.........] 
- ETA: 34s - loss: 1.7574 - regression_loss: 1.5035 - classification_loss: 0.2539 361/500 [====================>.........] - ETA: 34s - loss: 1.7582 - regression_loss: 1.5042 - classification_loss: 0.2540 362/500 [====================>.........] - ETA: 34s - loss: 1.7575 - regression_loss: 1.5037 - classification_loss: 0.2538 363/500 [====================>.........] - ETA: 34s - loss: 1.7580 - regression_loss: 1.5043 - classification_loss: 0.2537 364/500 [====================>.........] - ETA: 33s - loss: 1.7577 - regression_loss: 1.5040 - classification_loss: 0.2536 365/500 [====================>.........] - ETA: 33s - loss: 1.7561 - regression_loss: 1.5028 - classification_loss: 0.2533 366/500 [====================>.........] - ETA: 33s - loss: 1.7548 - regression_loss: 1.5018 - classification_loss: 0.2529 367/500 [=====================>........] - ETA: 33s - loss: 1.7548 - regression_loss: 1.5019 - classification_loss: 0.2529 368/500 [=====================>........] - ETA: 32s - loss: 1.7544 - regression_loss: 1.5016 - classification_loss: 0.2528 369/500 [=====================>........] - ETA: 32s - loss: 1.7540 - regression_loss: 1.5013 - classification_loss: 0.2527 370/500 [=====================>........] - ETA: 32s - loss: 1.7550 - regression_loss: 1.5020 - classification_loss: 0.2530 371/500 [=====================>........] - ETA: 32s - loss: 1.7563 - regression_loss: 1.5030 - classification_loss: 0.2533 372/500 [=====================>........] - ETA: 31s - loss: 1.7574 - regression_loss: 1.5039 - classification_loss: 0.2535 373/500 [=====================>........] - ETA: 31s - loss: 1.7551 - regression_loss: 1.5017 - classification_loss: 0.2534 374/500 [=====================>........] - ETA: 31s - loss: 1.7550 - regression_loss: 1.5017 - classification_loss: 0.2533 375/500 [=====================>........] - ETA: 31s - loss: 1.7556 - regression_loss: 1.5023 - classification_loss: 0.2533 376/500 [=====================>........] 
- ETA: 30s - loss: 1.7554 - regression_loss: 1.5022 - classification_loss: 0.2532 377/500 [=====================>........] - ETA: 30s - loss: 1.7558 - regression_loss: 1.5024 - classification_loss: 0.2533 378/500 [=====================>........] - ETA: 30s - loss: 1.7565 - regression_loss: 1.5031 - classification_loss: 0.2534 379/500 [=====================>........] - ETA: 30s - loss: 1.7569 - regression_loss: 1.5035 - classification_loss: 0.2534 380/500 [=====================>........] - ETA: 29s - loss: 1.7551 - regression_loss: 1.5021 - classification_loss: 0.2531 381/500 [=====================>........] - ETA: 29s - loss: 1.7557 - regression_loss: 1.5027 - classification_loss: 0.2530 382/500 [=====================>........] - ETA: 29s - loss: 1.7542 - regression_loss: 1.5014 - classification_loss: 0.2528 383/500 [=====================>........] - ETA: 29s - loss: 1.7546 - regression_loss: 1.5017 - classification_loss: 0.2528 384/500 [======================>.......] - ETA: 28s - loss: 1.7552 - regression_loss: 1.5022 - classification_loss: 0.2529 385/500 [======================>.......] - ETA: 28s - loss: 1.7559 - regression_loss: 1.5028 - classification_loss: 0.2531 386/500 [======================>.......] - ETA: 28s - loss: 1.7567 - regression_loss: 1.5033 - classification_loss: 0.2534 387/500 [======================>.......] - ETA: 28s - loss: 1.7554 - regression_loss: 1.5024 - classification_loss: 0.2530 388/500 [======================>.......] - ETA: 27s - loss: 1.7569 - regression_loss: 1.5037 - classification_loss: 0.2533 389/500 [======================>.......] - ETA: 27s - loss: 1.7570 - regression_loss: 1.5037 - classification_loss: 0.2533 390/500 [======================>.......] - ETA: 27s - loss: 1.7580 - regression_loss: 1.5046 - classification_loss: 0.2534 391/500 [======================>.......] - ETA: 27s - loss: 1.7563 - regression_loss: 1.5031 - classification_loss: 0.2532 392/500 [======================>.......] 
- ETA: 26s - loss: 1.7562 - regression_loss: 1.5030 - classification_loss: 0.2532 393/500 [======================>.......] - ETA: 26s - loss: 1.7570 - regression_loss: 1.5038 - classification_loss: 0.2532 394/500 [======================>.......] - ETA: 26s - loss: 1.7574 - regression_loss: 1.5040 - classification_loss: 0.2534 395/500 [======================>.......] - ETA: 26s - loss: 1.7555 - regression_loss: 1.5025 - classification_loss: 0.2530 396/500 [======================>.......] - ETA: 25s - loss: 1.7566 - regression_loss: 1.5035 - classification_loss: 0.2531 397/500 [======================>.......] - ETA: 25s - loss: 1.7580 - regression_loss: 1.5047 - classification_loss: 0.2533 398/500 [======================>.......] - ETA: 25s - loss: 1.7584 - regression_loss: 1.5052 - classification_loss: 0.2532 399/500 [======================>.......] - ETA: 25s - loss: 1.7581 - regression_loss: 1.5050 - classification_loss: 0.2531 400/500 [=======================>......] - ETA: 24s - loss: 1.7585 - regression_loss: 1.5051 - classification_loss: 0.2534 401/500 [=======================>......] - ETA: 24s - loss: 1.7574 - regression_loss: 1.5042 - classification_loss: 0.2532 402/500 [=======================>......] - ETA: 24s - loss: 1.7582 - regression_loss: 1.5049 - classification_loss: 0.2533 403/500 [=======================>......] - ETA: 24s - loss: 1.7585 - regression_loss: 1.5052 - classification_loss: 0.2533 404/500 [=======================>......] - ETA: 23s - loss: 1.7584 - regression_loss: 1.5052 - classification_loss: 0.2532 405/500 [=======================>......] - ETA: 23s - loss: 1.7581 - regression_loss: 1.5050 - classification_loss: 0.2532 406/500 [=======================>......] - ETA: 23s - loss: 1.7590 - regression_loss: 1.5055 - classification_loss: 0.2535 407/500 [=======================>......] - ETA: 23s - loss: 1.7571 - regression_loss: 1.5038 - classification_loss: 0.2532 408/500 [=======================>......] 
- ETA: 22s - loss: 1.7602 - regression_loss: 1.5059 - classification_loss: 0.2543 409/500 [=======================>......] - ETA: 22s - loss: 1.7609 - regression_loss: 1.5064 - classification_loss: 0.2545 410/500 [=======================>......] - ETA: 22s - loss: 1.7609 - regression_loss: 1.5063 - classification_loss: 0.2546 411/500 [=======================>......] - ETA: 22s - loss: 1.7606 - regression_loss: 1.5060 - classification_loss: 0.2546 412/500 [=======================>......] - ETA: 21s - loss: 1.7578 - regression_loss: 1.5033 - classification_loss: 0.2545 413/500 [=======================>......] - ETA: 21s - loss: 1.7575 - regression_loss: 1.5028 - classification_loss: 0.2547 414/500 [=======================>......] - ETA: 21s - loss: 1.7602 - regression_loss: 1.5050 - classification_loss: 0.2552 415/500 [=======================>......] - ETA: 21s - loss: 1.7598 - regression_loss: 1.5046 - classification_loss: 0.2552 416/500 [=======================>......] - ETA: 20s - loss: 1.7603 - regression_loss: 1.5048 - classification_loss: 0.2554 417/500 [========================>.....] - ETA: 20s - loss: 1.7601 - regression_loss: 1.5043 - classification_loss: 0.2558 418/500 [========================>.....] - ETA: 20s - loss: 1.7599 - regression_loss: 1.5041 - classification_loss: 0.2558 419/500 [========================>.....] - ETA: 20s - loss: 1.7585 - regression_loss: 1.5028 - classification_loss: 0.2557 420/500 [========================>.....] - ETA: 19s - loss: 1.7582 - regression_loss: 1.5025 - classification_loss: 0.2557 421/500 [========================>.....] - ETA: 19s - loss: 1.7594 - regression_loss: 1.5036 - classification_loss: 0.2558 422/500 [========================>.....] - ETA: 19s - loss: 1.7593 - regression_loss: 1.5034 - classification_loss: 0.2558 423/500 [========================>.....] - ETA: 19s - loss: 1.7592 - regression_loss: 1.5034 - classification_loss: 0.2559 424/500 [========================>.....] 
- ETA: 18s - loss: 1.7573 - regression_loss: 1.5017 - classification_loss: 0.2556 425/500 [========================>.....] - ETA: 18s - loss: 1.7580 - regression_loss: 1.5023 - classification_loss: 0.2557 426/500 [========================>.....] - ETA: 18s - loss: 1.7599 - regression_loss: 1.5040 - classification_loss: 0.2559 427/500 [========================>.....] - ETA: 18s - loss: 1.7590 - regression_loss: 1.5029 - classification_loss: 0.2561 428/500 [========================>.....] - ETA: 17s - loss: 1.7573 - regression_loss: 1.5013 - classification_loss: 0.2560 429/500 [========================>.....] - ETA: 17s - loss: 1.7592 - regression_loss: 1.5028 - classification_loss: 0.2564 430/500 [========================>.....] - ETA: 17s - loss: 1.7595 - regression_loss: 1.5030 - classification_loss: 0.2565 431/500 [========================>.....] - ETA: 17s - loss: 1.7603 - regression_loss: 1.5037 - classification_loss: 0.2566 432/500 [========================>.....] - ETA: 16s - loss: 1.7596 - regression_loss: 1.5031 - classification_loss: 0.2565 433/500 [========================>.....] - ETA: 16s - loss: 1.7599 - regression_loss: 1.5035 - classification_loss: 0.2564 434/500 [=========================>....] - ETA: 16s - loss: 1.7584 - regression_loss: 1.5023 - classification_loss: 0.2561 435/500 [=========================>....] - ETA: 16s - loss: 1.7594 - regression_loss: 1.5029 - classification_loss: 0.2566 436/500 [=========================>....] - ETA: 15s - loss: 1.7608 - regression_loss: 1.5043 - classification_loss: 0.2565 437/500 [=========================>....] - ETA: 15s - loss: 1.7619 - regression_loss: 1.5050 - classification_loss: 0.2569 438/500 [=========================>....] - ETA: 15s - loss: 1.7621 - regression_loss: 1.5053 - classification_loss: 0.2568 439/500 [=========================>....] - ETA: 15s - loss: 1.7615 - regression_loss: 1.5048 - classification_loss: 0.2567 440/500 [=========================>....] 
- ETA: 14s - loss: 1.7610 - regression_loss: 1.5045 - classification_loss: 0.2565 441/500 [=========================>....] - ETA: 14s - loss: 1.7601 - regression_loss: 1.5038 - classification_loss: 0.2563 442/500 [=========================>....] - ETA: 14s - loss: 1.7600 - regression_loss: 1.5038 - classification_loss: 0.2562 443/500 [=========================>....] - ETA: 14s - loss: 1.7578 - regression_loss: 1.5020 - classification_loss: 0.2558 444/500 [=========================>....] - ETA: 13s - loss: 1.7576 - regression_loss: 1.5017 - classification_loss: 0.2558 445/500 [=========================>....] - ETA: 13s - loss: 1.7581 - regression_loss: 1.5022 - classification_loss: 0.2559 446/500 [=========================>....] - ETA: 13s - loss: 1.7582 - regression_loss: 1.5023 - classification_loss: 0.2559 447/500 [=========================>....] - ETA: 13s - loss: 1.7583 - regression_loss: 1.5024 - classification_loss: 0.2559 448/500 [=========================>....] - ETA: 12s - loss: 1.7583 - regression_loss: 1.5024 - classification_loss: 0.2559 449/500 [=========================>....] - ETA: 12s - loss: 1.7577 - regression_loss: 1.5020 - classification_loss: 0.2557 450/500 [==========================>...] - ETA: 12s - loss: 1.7567 - regression_loss: 1.5011 - classification_loss: 0.2556 451/500 [==========================>...] - ETA: 12s - loss: 1.7570 - regression_loss: 1.5013 - classification_loss: 0.2557 452/500 [==========================>...] - ETA: 11s - loss: 1.7563 - regression_loss: 1.5008 - classification_loss: 0.2556 453/500 [==========================>...] - ETA: 11s - loss: 1.7559 - regression_loss: 1.5003 - classification_loss: 0.2556 454/500 [==========================>...] - ETA: 11s - loss: 1.7559 - regression_loss: 1.5004 - classification_loss: 0.2555 455/500 [==========================>...] - ETA: 11s - loss: 1.7537 - regression_loss: 1.4985 - classification_loss: 0.2552 456/500 [==========================>...] 
- ETA: 10s - loss: 1.7539 - regression_loss: 1.4988 - classification_loss: 0.2551 457/500 [==========================>...] - ETA: 10s - loss: 1.7540 - regression_loss: 1.4990 - classification_loss: 0.2550 458/500 [==========================>...] - ETA: 10s - loss: 1.7527 - regression_loss: 1.4979 - classification_loss: 0.2548 459/500 [==========================>...] - ETA: 10s - loss: 1.7537 - regression_loss: 1.4988 - classification_loss: 0.2549 460/500 [==========================>...] - ETA: 9s - loss: 1.7527 - regression_loss: 1.4980 - classification_loss: 0.2547  461/500 [==========================>...] - ETA: 9s - loss: 1.7522 - regression_loss: 1.4978 - classification_loss: 0.2544 462/500 [==========================>...] - ETA: 9s - loss: 1.7523 - regression_loss: 1.4979 - classification_loss: 0.2544 463/500 [==========================>...] - ETA: 9s - loss: 1.7532 - regression_loss: 1.4987 - classification_loss: 0.2545 464/500 [==========================>...] - ETA: 8s - loss: 1.7514 - regression_loss: 1.4972 - classification_loss: 0.2542 465/500 [==========================>...] - ETA: 8s - loss: 1.7522 - regression_loss: 1.4980 - classification_loss: 0.2542 466/500 [==========================>...] - ETA: 8s - loss: 1.7528 - regression_loss: 1.4986 - classification_loss: 0.2542 467/500 [===========================>..] - ETA: 8s - loss: 1.7532 - regression_loss: 1.4989 - classification_loss: 0.2543 468/500 [===========================>..] - ETA: 7s - loss: 1.7524 - regression_loss: 1.4983 - classification_loss: 0.2541 469/500 [===========================>..] - ETA: 7s - loss: 1.7525 - regression_loss: 1.4984 - classification_loss: 0.2541 470/500 [===========================>..] - ETA: 7s - loss: 1.7519 - regression_loss: 1.4980 - classification_loss: 0.2540 471/500 [===========================>..] - ETA: 7s - loss: 1.7522 - regression_loss: 1.4983 - classification_loss: 0.2539 472/500 [===========================>..] 
- ETA: 6s - loss: 1.7526 - regression_loss: 1.4987 - classification_loss: 0.2539 473/500 [===========================>..] - ETA: 6s - loss: 1.7530 - regression_loss: 1.4991 - classification_loss: 0.2540 474/500 [===========================>..] - ETA: 6s - loss: 1.7526 - regression_loss: 1.4986 - classification_loss: 0.2539 475/500 [===========================>..] - ETA: 6s - loss: 1.7527 - regression_loss: 1.4988 - classification_loss: 0.2539 476/500 [===========================>..] - ETA: 5s - loss: 1.7527 - regression_loss: 1.4988 - classification_loss: 0.2538 477/500 [===========================>..] - ETA: 5s - loss: 1.7529 - regression_loss: 1.4988 - classification_loss: 0.2541 478/500 [===========================>..] - ETA: 5s - loss: 1.7531 - regression_loss: 1.4990 - classification_loss: 0.2541 479/500 [===========================>..] - ETA: 5s - loss: 1.7543 - regression_loss: 1.5001 - classification_loss: 0.2542 480/500 [===========================>..] - ETA: 4s - loss: 1.7525 - regression_loss: 1.4985 - classification_loss: 0.2539 481/500 [===========================>..] - ETA: 4s - loss: 1.7532 - regression_loss: 1.4993 - classification_loss: 0.2539 482/500 [===========================>..] - ETA: 4s - loss: 1.7535 - regression_loss: 1.4996 - classification_loss: 0.2539 483/500 [===========================>..] - ETA: 4s - loss: 1.7526 - regression_loss: 1.4989 - classification_loss: 0.2538 484/500 [============================>.] - ETA: 3s - loss: 1.7528 - regression_loss: 1.4990 - classification_loss: 0.2538 485/500 [============================>.] - ETA: 3s - loss: 1.7524 - regression_loss: 1.4987 - classification_loss: 0.2537 486/500 [============================>.] - ETA: 3s - loss: 1.7528 - regression_loss: 1.4990 - classification_loss: 0.2538 487/500 [============================>.] - ETA: 3s - loss: 1.7532 - regression_loss: 1.4993 - classification_loss: 0.2539 488/500 [============================>.] 
- ETA: 2s - loss: 1.7527 - regression_loss: 1.4990 - classification_loss: 0.2538 489/500 [============================>.] - ETA: 2s - loss: 1.7528 - regression_loss: 1.4991 - classification_loss: 0.2538 490/500 [============================>.] - ETA: 2s - loss: 1.7523 - regression_loss: 1.4986 - classification_loss: 0.2537 491/500 [============================>.] - ETA: 2s - loss: 1.7512 - regression_loss: 1.4976 - classification_loss: 0.2536 492/500 [============================>.] - ETA: 1s - loss: 1.7499 - regression_loss: 1.4965 - classification_loss: 0.2533 493/500 [============================>.] - ETA: 1s - loss: 1.7498 - regression_loss: 1.4966 - classification_loss: 0.2532 494/500 [============================>.] - ETA: 1s - loss: 1.7505 - regression_loss: 1.4971 - classification_loss: 0.2534 495/500 [============================>.] - ETA: 1s - loss: 1.7499 - regression_loss: 1.4965 - classification_loss: 0.2534 496/500 [============================>.] - ETA: 0s - loss: 1.7502 - regression_loss: 1.4969 - classification_loss: 0.2533 497/500 [============================>.] - ETA: 0s - loss: 1.7502 - regression_loss: 1.4965 - classification_loss: 0.2537 498/500 [============================>.] - ETA: 0s - loss: 1.7505 - regression_loss: 1.4968 - classification_loss: 0.2537 499/500 [============================>.] - ETA: 0s - loss: 1.7505 - regression_loss: 1.4968 - classification_loss: 0.2537 500/500 [==============================] - 124s 248ms/step - loss: 1.7513 - regression_loss: 1.4975 - classification_loss: 0.2538 1172 instances of class plum with average precision: 0.6799 mAP: 0.6799 Epoch 00009: saving model to ./training/snapshots/resnet50_pascal_09.h5 Epoch 10/150 1/500 [..............................] - ETA: 2:02 - loss: 1.6723 - regression_loss: 1.4687 - classification_loss: 0.2036 2/500 [..............................] - ETA: 2:06 - loss: 1.5428 - regression_loss: 1.3358 - classification_loss: 0.2071 3/500 [..............................] 
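Each progress line above reports the composite training loss alongside its two components; in keras-retinanet the total is simply the sum of the box-regression term and the classification term, which the logged numbers confirm (1.4975 + 0.2538 = 1.7513). A minimal sketch for checking that from a captured log line (the regex and `parse_losses` helper are my own for illustration, not part of keras-retinanet):

```python
import re

# Final epoch-9 summary line as it appears in the log above.
LINE = ("500/500 [==============================] - 124s 248ms/step - "
        "loss: 1.7513 - regression_loss: 1.4975 - classification_loss: 0.2538")

def parse_losses(line):
    """Extract the named loss values from a Keras progress-bar line."""
    return {name: float(val)
            for name, val in re.findall(r"([a-z_]*loss): ([\d.]+)", line)}

losses = parse_losses(LINE)
# The total loss should equal regression_loss + classification_loss.
assert abs(losses["loss"]
           - (losses["regression_loss"] + losses["classification_loss"])) < 1e-3
```

The same helper can be mapped over a whole captured log to recover per-step loss curves that the flattened progress-bar output otherwise obscures.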
[per-batch progress updates for steps 2-115 of epoch 10 elided; loss fluctuated between ~1.64 and ~1.81]
- ETA: 1:33 - loss: 1.6938 - regression_loss: 1.4515 - classification_loss: 0.2422 116/500 [=====>........................] - ETA: 1:33 - loss: 1.6969 - regression_loss: 1.4545 - classification_loss: 0.2424 117/500 [======>.......................] - ETA: 1:33 - loss: 1.7038 - regression_loss: 1.4584 - classification_loss: 0.2454 118/500 [======>.......................] - ETA: 1:32 - loss: 1.7045 - regression_loss: 1.4596 - classification_loss: 0.2449 119/500 [======>.......................] - ETA: 1:32 - loss: 1.7048 - regression_loss: 1.4601 - classification_loss: 0.2447 120/500 [======>.......................] - ETA: 1:32 - loss: 1.7069 - regression_loss: 1.4617 - classification_loss: 0.2451 121/500 [======>.......................] - ETA: 1:32 - loss: 1.7088 - regression_loss: 1.4636 - classification_loss: 0.2453 122/500 [======>.......................] - ETA: 1:32 - loss: 1.6998 - regression_loss: 1.4558 - classification_loss: 0.2440 123/500 [======>.......................] - ETA: 1:31 - loss: 1.6993 - regression_loss: 1.4552 - classification_loss: 0.2440 124/500 [======>.......................] - ETA: 1:31 - loss: 1.6963 - regression_loss: 1.4532 - classification_loss: 0.2431 125/500 [======>.......................] - ETA: 1:31 - loss: 1.6913 - regression_loss: 1.4491 - classification_loss: 0.2422 126/500 [======>.......................] - ETA: 1:31 - loss: 1.6935 - regression_loss: 1.4505 - classification_loss: 0.2430 127/500 [======>.......................] - ETA: 1:30 - loss: 1.6960 - regression_loss: 1.4526 - classification_loss: 0.2434 128/500 [======>.......................] - ETA: 1:30 - loss: 1.6986 - regression_loss: 1.4550 - classification_loss: 0.2435 129/500 [======>.......................] - ETA: 1:30 - loss: 1.7000 - regression_loss: 1.4557 - classification_loss: 0.2443 130/500 [======>.......................] - ETA: 1:30 - loss: 1.6974 - regression_loss: 1.4538 - classification_loss: 0.2435 131/500 [======>.......................] 
- ETA: 1:30 - loss: 1.6995 - regression_loss: 1.4558 - classification_loss: 0.2437 132/500 [======>.......................] - ETA: 1:29 - loss: 1.6987 - regression_loss: 1.4552 - classification_loss: 0.2434 133/500 [======>.......................] - ETA: 1:29 - loss: 1.7001 - regression_loss: 1.4566 - classification_loss: 0.2435 134/500 [=======>......................] - ETA: 1:29 - loss: 1.7003 - regression_loss: 1.4573 - classification_loss: 0.2430 135/500 [=======>......................] - ETA: 1:29 - loss: 1.7029 - regression_loss: 1.4599 - classification_loss: 0.2430 136/500 [=======>......................] - ETA: 1:29 - loss: 1.7003 - regression_loss: 1.4577 - classification_loss: 0.2426 137/500 [=======>......................] - ETA: 1:28 - loss: 1.7020 - regression_loss: 1.4593 - classification_loss: 0.2427 138/500 [=======>......................] - ETA: 1:28 - loss: 1.7036 - regression_loss: 1.4610 - classification_loss: 0.2426 139/500 [=======>......................] - ETA: 1:28 - loss: 1.7036 - regression_loss: 1.4607 - classification_loss: 0.2429 140/500 [=======>......................] - ETA: 1:28 - loss: 1.7022 - regression_loss: 1.4594 - classification_loss: 0.2428 141/500 [=======>......................] - ETA: 1:27 - loss: 1.7028 - regression_loss: 1.4602 - classification_loss: 0.2426 142/500 [=======>......................] - ETA: 1:27 - loss: 1.7040 - regression_loss: 1.4613 - classification_loss: 0.2427 143/500 [=======>......................] - ETA: 1:27 - loss: 1.7034 - regression_loss: 1.4610 - classification_loss: 0.2424 144/500 [=======>......................] - ETA: 1:27 - loss: 1.7052 - regression_loss: 1.4626 - classification_loss: 0.2426 145/500 [=======>......................] - ETA: 1:27 - loss: 1.7045 - regression_loss: 1.4619 - classification_loss: 0.2425 146/500 [=======>......................] - ETA: 1:26 - loss: 1.7051 - regression_loss: 1.4619 - classification_loss: 0.2431 147/500 [=======>......................] 
- ETA: 1:26 - loss: 1.7058 - regression_loss: 1.4627 - classification_loss: 0.2430 148/500 [=======>......................] - ETA: 1:26 - loss: 1.7058 - regression_loss: 1.4632 - classification_loss: 0.2426 149/500 [=======>......................] - ETA: 1:26 - loss: 1.7000 - regression_loss: 1.4584 - classification_loss: 0.2416 150/500 [========>.....................] - ETA: 1:25 - loss: 1.6934 - regression_loss: 1.4527 - classification_loss: 0.2408 151/500 [========>.....................] - ETA: 1:25 - loss: 1.6936 - regression_loss: 1.4531 - classification_loss: 0.2405 152/500 [========>.....................] - ETA: 1:25 - loss: 1.6920 - regression_loss: 1.4517 - classification_loss: 0.2403 153/500 [========>.....................] - ETA: 1:25 - loss: 1.6942 - regression_loss: 1.4533 - classification_loss: 0.2409 154/500 [========>.....................] - ETA: 1:25 - loss: 1.6967 - regression_loss: 1.4558 - classification_loss: 0.2409 155/500 [========>.....................] - ETA: 1:24 - loss: 1.6959 - regression_loss: 1.4552 - classification_loss: 0.2407 156/500 [========>.....................] - ETA: 1:24 - loss: 1.6952 - regression_loss: 1.4546 - classification_loss: 0.2406 157/500 [========>.....................] - ETA: 1:24 - loss: 1.6950 - regression_loss: 1.4544 - classification_loss: 0.2406 158/500 [========>.....................] - ETA: 1:24 - loss: 1.6947 - regression_loss: 1.4534 - classification_loss: 0.2414 159/500 [========>.....................] - ETA: 1:23 - loss: 1.6980 - regression_loss: 1.4564 - classification_loss: 0.2417 160/500 [========>.....................] - ETA: 1:23 - loss: 1.7001 - regression_loss: 1.4583 - classification_loss: 0.2418 161/500 [========>.....................] - ETA: 1:23 - loss: 1.6999 - regression_loss: 1.4582 - classification_loss: 0.2417 162/500 [========>.....................] - ETA: 1:23 - loss: 1.7016 - regression_loss: 1.4598 - classification_loss: 0.2418 163/500 [========>.....................] 
- ETA: 1:23 - loss: 1.7044 - regression_loss: 1.4628 - classification_loss: 0.2417 164/500 [========>.....................] - ETA: 1:22 - loss: 1.7036 - regression_loss: 1.4622 - classification_loss: 0.2414 165/500 [========>.....................] - ETA: 1:22 - loss: 1.7048 - regression_loss: 1.4634 - classification_loss: 0.2414 166/500 [========>.....................] - ETA: 1:22 - loss: 1.7054 - regression_loss: 1.4640 - classification_loss: 0.2414 167/500 [=========>....................] - ETA: 1:22 - loss: 1.7040 - regression_loss: 1.4627 - classification_loss: 0.2412 168/500 [=========>....................] - ETA: 1:21 - loss: 1.7070 - regression_loss: 1.4651 - classification_loss: 0.2420 169/500 [=========>....................] - ETA: 1:21 - loss: 1.7076 - regression_loss: 1.4655 - classification_loss: 0.2421 170/500 [=========>....................] - ETA: 1:21 - loss: 1.7094 - regression_loss: 1.4666 - classification_loss: 0.2428 171/500 [=========>....................] - ETA: 1:21 - loss: 1.7102 - regression_loss: 1.4671 - classification_loss: 0.2431 172/500 [=========>....................] - ETA: 1:20 - loss: 1.7113 - regression_loss: 1.4681 - classification_loss: 0.2433 173/500 [=========>....................] - ETA: 1:20 - loss: 1.7090 - regression_loss: 1.4648 - classification_loss: 0.2442 174/500 [=========>....................] - ETA: 1:20 - loss: 1.7110 - regression_loss: 1.4668 - classification_loss: 0.2443 175/500 [=========>....................] - ETA: 1:20 - loss: 1.7109 - regression_loss: 1.4666 - classification_loss: 0.2443 176/500 [=========>....................] - ETA: 1:19 - loss: 1.7067 - regression_loss: 1.4632 - classification_loss: 0.2435 177/500 [=========>....................] - ETA: 1:19 - loss: 1.7053 - regression_loss: 1.4622 - classification_loss: 0.2431 178/500 [=========>....................] - ETA: 1:19 - loss: 1.7069 - regression_loss: 1.4637 - classification_loss: 0.2432 179/500 [=========>....................] 
- ETA: 1:19 - loss: 1.7063 - regression_loss: 1.4632 - classification_loss: 0.2431 180/500 [=========>....................] - ETA: 1:19 - loss: 1.7080 - regression_loss: 1.4647 - classification_loss: 0.2433 181/500 [=========>....................] - ETA: 1:18 - loss: 1.7069 - regression_loss: 1.4639 - classification_loss: 0.2431 182/500 [=========>....................] - ETA: 1:18 - loss: 1.7073 - regression_loss: 1.4644 - classification_loss: 0.2429 183/500 [=========>....................] - ETA: 1:18 - loss: 1.7071 - regression_loss: 1.4643 - classification_loss: 0.2428 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7043 - regression_loss: 1.4619 - classification_loss: 0.2424 185/500 [==========>...................] - ETA: 1:17 - loss: 1.7025 - regression_loss: 1.4604 - classification_loss: 0.2421 186/500 [==========>...................] - ETA: 1:17 - loss: 1.7021 - regression_loss: 1.4606 - classification_loss: 0.2415 187/500 [==========>...................] - ETA: 1:17 - loss: 1.6973 - regression_loss: 1.4566 - classification_loss: 0.2407 188/500 [==========>...................] - ETA: 1:17 - loss: 1.6979 - regression_loss: 1.4571 - classification_loss: 0.2408 189/500 [==========>...................] - ETA: 1:16 - loss: 1.6956 - regression_loss: 1.4553 - classification_loss: 0.2403 190/500 [==========>...................] - ETA: 1:16 - loss: 1.6920 - regression_loss: 1.4522 - classification_loss: 0.2398 191/500 [==========>...................] - ETA: 1:16 - loss: 1.6932 - regression_loss: 1.4534 - classification_loss: 0.2398 192/500 [==========>...................] - ETA: 1:16 - loss: 1.6960 - regression_loss: 1.4558 - classification_loss: 0.2402 193/500 [==========>...................] - ETA: 1:15 - loss: 1.6952 - regression_loss: 1.4551 - classification_loss: 0.2401 194/500 [==========>...................] - ETA: 1:15 - loss: 1.6956 - regression_loss: 1.4558 - classification_loss: 0.2398 195/500 [==========>...................] 
- ETA: 1:15 - loss: 1.6961 - regression_loss: 1.4562 - classification_loss: 0.2399 196/500 [==========>...................] - ETA: 1:15 - loss: 1.6964 - regression_loss: 1.4567 - classification_loss: 0.2397 197/500 [==========>...................] - ETA: 1:14 - loss: 1.6979 - regression_loss: 1.4578 - classification_loss: 0.2401 198/500 [==========>...................] - ETA: 1:14 - loss: 1.6989 - regression_loss: 1.4588 - classification_loss: 0.2401 199/500 [==========>...................] - ETA: 1:14 - loss: 1.7005 - regression_loss: 1.4600 - classification_loss: 0.2406 200/500 [===========>..................] - ETA: 1:14 - loss: 1.6995 - regression_loss: 1.4590 - classification_loss: 0.2405 201/500 [===========>..................] - ETA: 1:13 - loss: 1.6997 - regression_loss: 1.4592 - classification_loss: 0.2405 202/500 [===========>..................] - ETA: 1:13 - loss: 1.6951 - regression_loss: 1.4554 - classification_loss: 0.2397 203/500 [===========>..................] - ETA: 1:13 - loss: 1.6955 - regression_loss: 1.4558 - classification_loss: 0.2397 204/500 [===========>..................] - ETA: 1:13 - loss: 1.6947 - regression_loss: 1.4550 - classification_loss: 0.2396 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6945 - regression_loss: 1.4549 - classification_loss: 0.2396 206/500 [===========>..................] - ETA: 1:12 - loss: 1.6931 - regression_loss: 1.4536 - classification_loss: 0.2395 207/500 [===========>..................] - ETA: 1:12 - loss: 1.6936 - regression_loss: 1.4542 - classification_loss: 0.2394 208/500 [===========>..................] - ETA: 1:12 - loss: 1.6883 - regression_loss: 1.4497 - classification_loss: 0.2386 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6896 - regression_loss: 1.4508 - classification_loss: 0.2388 210/500 [===========>..................] - ETA: 1:11 - loss: 1.6875 - regression_loss: 1.4492 - classification_loss: 0.2383 211/500 [===========>..................] 
- ETA: 1:11 - loss: 1.6874 - regression_loss: 1.4492 - classification_loss: 0.2382 212/500 [===========>..................] - ETA: 1:11 - loss: 1.6876 - regression_loss: 1.4494 - classification_loss: 0.2382 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6876 - regression_loss: 1.4495 - classification_loss: 0.2381 214/500 [===========>..................] - ETA: 1:10 - loss: 1.6920 - regression_loss: 1.4531 - classification_loss: 0.2389 215/500 [===========>..................] - ETA: 1:10 - loss: 1.6940 - regression_loss: 1.4550 - classification_loss: 0.2390 216/500 [===========>..................] - ETA: 1:10 - loss: 1.6952 - regression_loss: 1.4564 - classification_loss: 0.2388 217/500 [============>.................] - ETA: 1:10 - loss: 1.6958 - regression_loss: 1.4570 - classification_loss: 0.2388 218/500 [============>.................] - ETA: 1:09 - loss: 1.6963 - regression_loss: 1.4575 - classification_loss: 0.2388 219/500 [============>.................] - ETA: 1:09 - loss: 1.6944 - regression_loss: 1.4559 - classification_loss: 0.2385 220/500 [============>.................] - ETA: 1:09 - loss: 1.6940 - regression_loss: 1.4557 - classification_loss: 0.2383 221/500 [============>.................] - ETA: 1:08 - loss: 1.6933 - regression_loss: 1.4550 - classification_loss: 0.2383 222/500 [============>.................] - ETA: 1:08 - loss: 1.6952 - regression_loss: 1.4567 - classification_loss: 0.2385 223/500 [============>.................] - ETA: 1:08 - loss: 1.6950 - regression_loss: 1.4567 - classification_loss: 0.2382 224/500 [============>.................] - ETA: 1:08 - loss: 1.6960 - regression_loss: 1.4577 - classification_loss: 0.2383 225/500 [============>.................] - ETA: 1:08 - loss: 1.6933 - regression_loss: 1.4554 - classification_loss: 0.2379 226/500 [============>.................] - ETA: 1:07 - loss: 1.6910 - regression_loss: 1.4537 - classification_loss: 0.2374 227/500 [============>.................] 
- ETA: 1:07 - loss: 1.6907 - regression_loss: 1.4534 - classification_loss: 0.2374 228/500 [============>.................] - ETA: 1:07 - loss: 1.6888 - regression_loss: 1.4518 - classification_loss: 0.2370 229/500 [============>.................] - ETA: 1:07 - loss: 1.6918 - regression_loss: 1.4546 - classification_loss: 0.2372 230/500 [============>.................] - ETA: 1:06 - loss: 1.6933 - regression_loss: 1.4560 - classification_loss: 0.2372 231/500 [============>.................] - ETA: 1:06 - loss: 1.6944 - regression_loss: 1.4569 - classification_loss: 0.2374 232/500 [============>.................] - ETA: 1:06 - loss: 1.6951 - regression_loss: 1.4575 - classification_loss: 0.2376 233/500 [============>.................] - ETA: 1:06 - loss: 1.6973 - regression_loss: 1.4592 - classification_loss: 0.2380 234/500 [=============>................] - ETA: 1:05 - loss: 1.6981 - regression_loss: 1.4598 - classification_loss: 0.2383 235/500 [=============>................] - ETA: 1:05 - loss: 1.6970 - regression_loss: 1.4588 - classification_loss: 0.2382 236/500 [=============>................] - ETA: 1:05 - loss: 1.6965 - regression_loss: 1.4585 - classification_loss: 0.2380 237/500 [=============>................] - ETA: 1:05 - loss: 1.6974 - regression_loss: 1.4591 - classification_loss: 0.2383 238/500 [=============>................] - ETA: 1:04 - loss: 1.7011 - regression_loss: 1.4623 - classification_loss: 0.2389 239/500 [=============>................] - ETA: 1:04 - loss: 1.7004 - regression_loss: 1.4615 - classification_loss: 0.2389 240/500 [=============>................] - ETA: 1:04 - loss: 1.7004 - regression_loss: 1.4617 - classification_loss: 0.2387 241/500 [=============>................] - ETA: 1:04 - loss: 1.6973 - regression_loss: 1.4592 - classification_loss: 0.2381 242/500 [=============>................] - ETA: 1:03 - loss: 1.6974 - regression_loss: 1.4593 - classification_loss: 0.2380 243/500 [=============>................] 
- ETA: 1:03 - loss: 1.6978 - regression_loss: 1.4598 - classification_loss: 0.2380 244/500 [=============>................] - ETA: 1:03 - loss: 1.6982 - regression_loss: 1.4601 - classification_loss: 0.2382 245/500 [=============>................] - ETA: 1:03 - loss: 1.6993 - regression_loss: 1.4612 - classification_loss: 0.2382 246/500 [=============>................] - ETA: 1:02 - loss: 1.7005 - regression_loss: 1.4622 - classification_loss: 0.2383 247/500 [=============>................] - ETA: 1:02 - loss: 1.6987 - regression_loss: 1.4606 - classification_loss: 0.2380 248/500 [=============>................] - ETA: 1:02 - loss: 1.6970 - regression_loss: 1.4594 - classification_loss: 0.2375 249/500 [=============>................] - ETA: 1:02 - loss: 1.6967 - regression_loss: 1.4592 - classification_loss: 0.2375 250/500 [==============>...............] - ETA: 1:01 - loss: 1.6941 - regression_loss: 1.4572 - classification_loss: 0.2369 251/500 [==============>...............] - ETA: 1:01 - loss: 1.6962 - regression_loss: 1.4589 - classification_loss: 0.2373 252/500 [==============>...............] - ETA: 1:01 - loss: 1.6979 - regression_loss: 1.4605 - classification_loss: 0.2374 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7001 - regression_loss: 1.4625 - classification_loss: 0.2377 254/500 [==============>...............] - ETA: 1:00 - loss: 1.7012 - regression_loss: 1.4639 - classification_loss: 0.2373 255/500 [==============>...............] - ETA: 1:00 - loss: 1.7004 - regression_loss: 1.4630 - classification_loss: 0.2374 256/500 [==============>...............] - ETA: 1:00 - loss: 1.7009 - regression_loss: 1.4633 - classification_loss: 0.2376 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7024 - regression_loss: 1.4649 - classification_loss: 0.2376 258/500 [==============>...............] - ETA: 59s - loss: 1.7027 - regression_loss: 1.4652 - classification_loss: 0.2375  259/500 [==============>...............] 
- ETA: 59s - loss: 1.7018 - regression_loss: 1.4642 - classification_loss: 0.2376 260/500 [==============>...............] - ETA: 59s - loss: 1.7013 - regression_loss: 1.4638 - classification_loss: 0.2375 261/500 [==============>...............] - ETA: 59s - loss: 1.7008 - regression_loss: 1.4636 - classification_loss: 0.2373 262/500 [==============>...............] - ETA: 58s - loss: 1.6986 - regression_loss: 1.4614 - classification_loss: 0.2372 263/500 [==============>...............] - ETA: 58s - loss: 1.7012 - regression_loss: 1.4637 - classification_loss: 0.2375 264/500 [==============>...............] - ETA: 58s - loss: 1.7010 - regression_loss: 1.4633 - classification_loss: 0.2376 265/500 [==============>...............] - ETA: 58s - loss: 1.7023 - regression_loss: 1.4645 - classification_loss: 0.2378 266/500 [==============>...............] - ETA: 57s - loss: 1.7015 - regression_loss: 1.4640 - classification_loss: 0.2375 267/500 [===============>..............] - ETA: 57s - loss: 1.7014 - regression_loss: 1.4640 - classification_loss: 0.2373 268/500 [===============>..............] - ETA: 57s - loss: 1.7024 - regression_loss: 1.4648 - classification_loss: 0.2376 269/500 [===============>..............] - ETA: 57s - loss: 1.7002 - regression_loss: 1.4629 - classification_loss: 0.2373 270/500 [===============>..............] - ETA: 56s - loss: 1.6993 - regression_loss: 1.4622 - classification_loss: 0.2370 271/500 [===============>..............] - ETA: 56s - loss: 1.7003 - regression_loss: 1.4631 - classification_loss: 0.2372 272/500 [===============>..............] - ETA: 56s - loss: 1.7017 - regression_loss: 1.4644 - classification_loss: 0.2373 273/500 [===============>..............] - ETA: 56s - loss: 1.7027 - regression_loss: 1.4653 - classification_loss: 0.2374 274/500 [===============>..............] - ETA: 55s - loss: 1.7036 - regression_loss: 1.4657 - classification_loss: 0.2379 275/500 [===============>..............] 
- ETA: 55s - loss: 1.7035 - regression_loss: 1.4656 - classification_loss: 0.2380 276/500 [===============>..............] - ETA: 55s - loss: 1.7035 - regression_loss: 1.4654 - classification_loss: 0.2381 277/500 [===============>..............] - ETA: 55s - loss: 1.7052 - regression_loss: 1.4665 - classification_loss: 0.2387 278/500 [===============>..............] - ETA: 54s - loss: 1.7061 - regression_loss: 1.4672 - classification_loss: 0.2389 279/500 [===============>..............] - ETA: 54s - loss: 1.7063 - regression_loss: 1.4675 - classification_loss: 0.2389 280/500 [===============>..............] - ETA: 54s - loss: 1.7063 - regression_loss: 1.4675 - classification_loss: 0.2387 281/500 [===============>..............] - ETA: 54s - loss: 1.7058 - regression_loss: 1.4671 - classification_loss: 0.2387 282/500 [===============>..............] - ETA: 53s - loss: 1.7059 - regression_loss: 1.4673 - classification_loss: 0.2386 283/500 [===============>..............] - ETA: 53s - loss: 1.7067 - regression_loss: 1.4681 - classification_loss: 0.2386 284/500 [================>.............] - ETA: 53s - loss: 1.7129 - regression_loss: 1.4730 - classification_loss: 0.2399 285/500 [================>.............] - ETA: 53s - loss: 1.7125 - regression_loss: 1.4726 - classification_loss: 0.2399 286/500 [================>.............] - ETA: 53s - loss: 1.7113 - regression_loss: 1.4714 - classification_loss: 0.2399 287/500 [================>.............] - ETA: 52s - loss: 1.7123 - regression_loss: 1.4721 - classification_loss: 0.2402 288/500 [================>.............] - ETA: 52s - loss: 1.7120 - regression_loss: 1.4720 - classification_loss: 0.2400 289/500 [================>.............] - ETA: 52s - loss: 1.7091 - regression_loss: 1.4694 - classification_loss: 0.2397 290/500 [================>.............] - ETA: 52s - loss: 1.7087 - regression_loss: 1.4691 - classification_loss: 0.2396 291/500 [================>.............] 
- ETA: 51s - loss: 1.7054 - regression_loss: 1.4663 - classification_loss: 0.2391 292/500 [================>.............] - ETA: 51s - loss: 1.7023 - regression_loss: 1.4636 - classification_loss: 0.2386 293/500 [================>.............] - ETA: 51s - loss: 1.6983 - regression_loss: 1.4601 - classification_loss: 0.2382 294/500 [================>.............] - ETA: 51s - loss: 1.6979 - regression_loss: 1.4599 - classification_loss: 0.2380 295/500 [================>.............] - ETA: 50s - loss: 1.6968 - regression_loss: 1.4589 - classification_loss: 0.2379 296/500 [================>.............] - ETA: 50s - loss: 1.6965 - regression_loss: 1.4586 - classification_loss: 0.2379 297/500 [================>.............] - ETA: 50s - loss: 1.6961 - regression_loss: 1.4583 - classification_loss: 0.2378 298/500 [================>.............] - ETA: 50s - loss: 1.6962 - regression_loss: 1.4585 - classification_loss: 0.2377 299/500 [================>.............] - ETA: 49s - loss: 1.6965 - regression_loss: 1.4588 - classification_loss: 0.2377 300/500 [=================>............] - ETA: 49s - loss: 1.6960 - regression_loss: 1.4581 - classification_loss: 0.2379 301/500 [=================>............] - ETA: 49s - loss: 1.6961 - regression_loss: 1.4583 - classification_loss: 0.2378 302/500 [=================>............] - ETA: 49s - loss: 1.6960 - regression_loss: 1.4580 - classification_loss: 0.2381 303/500 [=================>............] - ETA: 48s - loss: 1.6963 - regression_loss: 1.4582 - classification_loss: 0.2381 304/500 [=================>............] - ETA: 48s - loss: 1.6961 - regression_loss: 1.4582 - classification_loss: 0.2379 305/500 [=================>............] - ETA: 48s - loss: 1.6938 - regression_loss: 1.4564 - classification_loss: 0.2375 306/500 [=================>............] - ETA: 48s - loss: 1.6922 - regression_loss: 1.4550 - classification_loss: 0.2372 307/500 [=================>............] 
- ETA: 47s - loss: 1.6937 - regression_loss: 1.4563 - classification_loss: 0.2374 308/500 [=================>............] - ETA: 47s - loss: 1.6926 - regression_loss: 1.4555 - classification_loss: 0.2371 309/500 [=================>............] - ETA: 47s - loss: 1.6922 - regression_loss: 1.4553 - classification_loss: 0.2369 310/500 [=================>............] - ETA: 47s - loss: 1.6919 - regression_loss: 1.4549 - classification_loss: 0.2370 311/500 [=================>............] - ETA: 46s - loss: 1.6924 - regression_loss: 1.4552 - classification_loss: 0.2372 312/500 [=================>............] - ETA: 46s - loss: 1.6930 - regression_loss: 1.4557 - classification_loss: 0.2373 313/500 [=================>............] - ETA: 46s - loss: 1.6934 - regression_loss: 1.4561 - classification_loss: 0.2372 314/500 [=================>............] - ETA: 46s - loss: 1.6945 - regression_loss: 1.4572 - classification_loss: 0.2374 315/500 [=================>............] - ETA: 45s - loss: 1.6930 - regression_loss: 1.4558 - classification_loss: 0.2372 316/500 [=================>............] - ETA: 45s - loss: 1.6930 - regression_loss: 1.4557 - classification_loss: 0.2372 317/500 [==================>...........] - ETA: 45s - loss: 1.6937 - regression_loss: 1.4563 - classification_loss: 0.2373 318/500 [==================>...........] - ETA: 45s - loss: 1.6938 - regression_loss: 1.4565 - classification_loss: 0.2373 319/500 [==================>...........] - ETA: 44s - loss: 1.6917 - regression_loss: 1.4549 - classification_loss: 0.2369 320/500 [==================>...........] - ETA: 44s - loss: 1.6905 - regression_loss: 1.4539 - classification_loss: 0.2366 321/500 [==================>...........] - ETA: 44s - loss: 1.6884 - regression_loss: 1.4520 - classification_loss: 0.2364 322/500 [==================>...........] - ETA: 44s - loss: 1.6879 - regression_loss: 1.4516 - classification_loss: 0.2363 323/500 [==================>...........] 
- ETA: 43s - loss: 1.6865 - regression_loss: 1.4505 - classification_loss: 0.2361
[per-batch progress updates for steps 324-499 of epoch 10 elided; running loss fluctuated between ~1.676 and ~1.698, regression_loss ~1.441-1.459, classification_loss ~0.235-0.240]
500/500 [==============================] - 124s 247ms/step - loss: 1.6933 - regression_loss: 1.4537 - classification_loss: 0.2397
1172 instances of class plum with average precision: 0.6413
mAP: 0.6413
Epoch 00010: saving model to ./training/snapshots/resnet50_pascal_10.h5
Epoch 11/150
[per-batch progress updates for steps 1-157 of epoch 11 elided; running loss was volatile over the first steps (roughly 1.09-1.78) and settled near 1.61-1.62 by step 157, with classification_loss around 0.223]
158/500 [========>.....................]
- ETA: 1:25 - loss: 1.6159 - regression_loss: 1.3933 - classification_loss: 0.2226 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6169 - regression_loss: 1.3944 - classification_loss: 0.2226 160/500 [========>.....................] - ETA: 1:25 - loss: 1.6182 - regression_loss: 1.3956 - classification_loss: 0.2226 161/500 [========>.....................] - ETA: 1:25 - loss: 1.6137 - regression_loss: 1.3918 - classification_loss: 0.2219 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6175 - regression_loss: 1.3953 - classification_loss: 0.2222 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6183 - regression_loss: 1.3960 - classification_loss: 0.2223 164/500 [========>.....................] - ETA: 1:24 - loss: 1.6194 - regression_loss: 1.3974 - classification_loss: 0.2220 165/500 [========>.....................] - ETA: 1:24 - loss: 1.6170 - regression_loss: 1.3957 - classification_loss: 0.2214 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6159 - regression_loss: 1.3946 - classification_loss: 0.2213 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6130 - regression_loss: 1.3917 - classification_loss: 0.2212 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6167 - regression_loss: 1.3944 - classification_loss: 0.2223 169/500 [=========>....................] - ETA: 1:23 - loss: 1.6195 - regression_loss: 1.3969 - classification_loss: 0.2227 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6216 - regression_loss: 1.3988 - classification_loss: 0.2229 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6216 - regression_loss: 1.3986 - classification_loss: 0.2230 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6213 - regression_loss: 1.3982 - classification_loss: 0.2231 173/500 [=========>....................] - ETA: 1:22 - loss: 1.6215 - regression_loss: 1.3977 - classification_loss: 0.2238 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.6211 - regression_loss: 1.3973 - classification_loss: 0.2238 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6241 - regression_loss: 1.3999 - classification_loss: 0.2242 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6260 - regression_loss: 1.4014 - classification_loss: 0.2246 177/500 [=========>....................] - ETA: 1:21 - loss: 1.6262 - regression_loss: 1.4001 - classification_loss: 0.2261 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6247 - regression_loss: 1.3988 - classification_loss: 0.2258 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6233 - regression_loss: 1.3978 - classification_loss: 0.2255 180/500 [=========>....................] - ETA: 1:20 - loss: 1.6263 - regression_loss: 1.4001 - classification_loss: 0.2262 181/500 [=========>....................] - ETA: 1:20 - loss: 1.6240 - regression_loss: 1.3983 - classification_loss: 0.2257 182/500 [=========>....................] - ETA: 1:20 - loss: 1.6235 - regression_loss: 1.3976 - classification_loss: 0.2259 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6234 - regression_loss: 1.3978 - classification_loss: 0.2257 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6235 - regression_loss: 1.3978 - classification_loss: 0.2257 185/500 [==========>...................] - ETA: 1:19 - loss: 1.6196 - regression_loss: 1.3946 - classification_loss: 0.2251 186/500 [==========>...................] - ETA: 1:19 - loss: 1.6221 - regression_loss: 1.3964 - classification_loss: 0.2257 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6223 - regression_loss: 1.3967 - classification_loss: 0.2257 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6196 - regression_loss: 1.3946 - classification_loss: 0.2250 189/500 [==========>...................] - ETA: 1:18 - loss: 1.6216 - regression_loss: 1.3964 - classification_loss: 0.2252 190/500 [==========>...................] 
- ETA: 1:18 - loss: 1.6172 - regression_loss: 1.3929 - classification_loss: 0.2244 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6203 - regression_loss: 1.3955 - classification_loss: 0.2249 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6175 - regression_loss: 1.3930 - classification_loss: 0.2245 193/500 [==========>...................] - ETA: 1:17 - loss: 1.6168 - regression_loss: 1.3926 - classification_loss: 0.2242 194/500 [==========>...................] - ETA: 1:17 - loss: 1.6119 - regression_loss: 1.3884 - classification_loss: 0.2236 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6151 - regression_loss: 1.3909 - classification_loss: 0.2242 196/500 [==========>...................] - ETA: 1:16 - loss: 1.6160 - regression_loss: 1.3917 - classification_loss: 0.2243 197/500 [==========>...................] - ETA: 1:16 - loss: 1.6141 - regression_loss: 1.3901 - classification_loss: 0.2240 198/500 [==========>...................] - ETA: 1:16 - loss: 1.6158 - regression_loss: 1.3915 - classification_loss: 0.2242 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6206 - regression_loss: 1.3958 - classification_loss: 0.2249 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6215 - regression_loss: 1.3964 - classification_loss: 0.2251 201/500 [===========>..................] - ETA: 1:15 - loss: 1.6238 - regression_loss: 1.3982 - classification_loss: 0.2256 202/500 [===========>..................] - ETA: 1:15 - loss: 1.6238 - regression_loss: 1.3983 - classification_loss: 0.2255 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6251 - regression_loss: 1.3994 - classification_loss: 0.2257 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6259 - regression_loss: 1.3997 - classification_loss: 0.2261 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6255 - regression_loss: 1.3995 - classification_loss: 0.2260 206/500 [===========>..................] 
- ETA: 1:14 - loss: 1.6268 - regression_loss: 1.4006 - classification_loss: 0.2262 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6267 - regression_loss: 1.4004 - classification_loss: 0.2263 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6282 - regression_loss: 1.4017 - classification_loss: 0.2265 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6278 - regression_loss: 1.4017 - classification_loss: 0.2261 210/500 [===========>..................] - ETA: 1:13 - loss: 1.6290 - regression_loss: 1.4029 - classification_loss: 0.2261 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6291 - regression_loss: 1.4028 - classification_loss: 0.2264 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6262 - regression_loss: 1.4001 - classification_loss: 0.2261 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6278 - regression_loss: 1.4014 - classification_loss: 0.2264 214/500 [===========>..................] - ETA: 1:12 - loss: 1.6269 - regression_loss: 1.4004 - classification_loss: 0.2265 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6279 - regression_loss: 1.4011 - classification_loss: 0.2268 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6286 - regression_loss: 1.4018 - classification_loss: 0.2268 217/500 [============>.................] - ETA: 1:11 - loss: 1.6314 - regression_loss: 1.4043 - classification_loss: 0.2270 218/500 [============>.................] - ETA: 1:11 - loss: 1.6320 - regression_loss: 1.4050 - classification_loss: 0.2271 219/500 [============>.................] - ETA: 1:10 - loss: 1.6318 - regression_loss: 1.4047 - classification_loss: 0.2270 220/500 [============>.................] - ETA: 1:10 - loss: 1.6318 - regression_loss: 1.4050 - classification_loss: 0.2268 221/500 [============>.................] - ETA: 1:10 - loss: 1.6323 - regression_loss: 1.4054 - classification_loss: 0.2269 222/500 [============>.................] 
- ETA: 1:10 - loss: 1.6320 - regression_loss: 1.4049 - classification_loss: 0.2271 223/500 [============>.................] - ETA: 1:09 - loss: 1.6320 - regression_loss: 1.4051 - classification_loss: 0.2269 224/500 [============>.................] - ETA: 1:09 - loss: 1.6345 - regression_loss: 1.4072 - classification_loss: 0.2274 225/500 [============>.................] - ETA: 1:09 - loss: 1.6326 - regression_loss: 1.4054 - classification_loss: 0.2272 226/500 [============>.................] - ETA: 1:09 - loss: 1.6332 - regression_loss: 1.4059 - classification_loss: 0.2273 227/500 [============>.................] - ETA: 1:08 - loss: 1.6330 - regression_loss: 1.4055 - classification_loss: 0.2275 228/500 [============>.................] - ETA: 1:08 - loss: 1.6333 - regression_loss: 1.4058 - classification_loss: 0.2275 229/500 [============>.................] - ETA: 1:08 - loss: 1.6388 - regression_loss: 1.4099 - classification_loss: 0.2289 230/500 [============>.................] - ETA: 1:08 - loss: 1.6377 - regression_loss: 1.4090 - classification_loss: 0.2287 231/500 [============>.................] - ETA: 1:07 - loss: 1.6399 - regression_loss: 1.4111 - classification_loss: 0.2289 232/500 [============>.................] - ETA: 1:07 - loss: 1.6416 - regression_loss: 1.4125 - classification_loss: 0.2291 233/500 [============>.................] - ETA: 1:07 - loss: 1.6409 - regression_loss: 1.4122 - classification_loss: 0.2288 234/500 [=============>................] - ETA: 1:07 - loss: 1.6429 - regression_loss: 1.4135 - classification_loss: 0.2294 235/500 [=============>................] - ETA: 1:06 - loss: 1.6440 - regression_loss: 1.4143 - classification_loss: 0.2297 236/500 [=============>................] - ETA: 1:06 - loss: 1.6430 - regression_loss: 1.4138 - classification_loss: 0.2291 237/500 [=============>................] - ETA: 1:06 - loss: 1.6427 - regression_loss: 1.4138 - classification_loss: 0.2289 238/500 [=============>................] 
- ETA: 1:06 - loss: 1.6439 - regression_loss: 1.4146 - classification_loss: 0.2293 239/500 [=============>................] - ETA: 1:05 - loss: 1.6430 - regression_loss: 1.4138 - classification_loss: 0.2293 240/500 [=============>................] - ETA: 1:05 - loss: 1.6443 - regression_loss: 1.4150 - classification_loss: 0.2294 241/500 [=============>................] - ETA: 1:05 - loss: 1.6402 - regression_loss: 1.4114 - classification_loss: 0.2288 242/500 [=============>................] - ETA: 1:05 - loss: 1.6395 - regression_loss: 1.4109 - classification_loss: 0.2286 243/500 [=============>................] - ETA: 1:04 - loss: 1.6398 - regression_loss: 1.4111 - classification_loss: 0.2287 244/500 [=============>................] - ETA: 1:04 - loss: 1.6413 - regression_loss: 1.4126 - classification_loss: 0.2287 245/500 [=============>................] - ETA: 1:04 - loss: 1.6419 - regression_loss: 1.4131 - classification_loss: 0.2288 246/500 [=============>................] - ETA: 1:04 - loss: 1.6371 - regression_loss: 1.4089 - classification_loss: 0.2281 247/500 [=============>................] - ETA: 1:03 - loss: 1.6367 - regression_loss: 1.4086 - classification_loss: 0.2280 248/500 [=============>................] - ETA: 1:03 - loss: 1.6376 - regression_loss: 1.4095 - classification_loss: 0.2281 249/500 [=============>................] - ETA: 1:03 - loss: 1.6339 - regression_loss: 1.4063 - classification_loss: 0.2276 250/500 [==============>...............] - ETA: 1:03 - loss: 1.6310 - regression_loss: 1.4038 - classification_loss: 0.2272 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6292 - regression_loss: 1.4024 - classification_loss: 0.2268 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6286 - regression_loss: 1.4020 - classification_loss: 0.2266 253/500 [==============>...............] - ETA: 1:02 - loss: 1.6301 - regression_loss: 1.4031 - classification_loss: 0.2269 254/500 [==============>...............] 
- ETA: 1:02 - loss: 1.6333 - regression_loss: 1.4063 - classification_loss: 0.2270 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6332 - regression_loss: 1.4064 - classification_loss: 0.2268 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6340 - regression_loss: 1.4070 - classification_loss: 0.2270 257/500 [==============>...............] - ETA: 1:01 - loss: 1.6325 - regression_loss: 1.4058 - classification_loss: 0.2267 258/500 [==============>...............] - ETA: 1:01 - loss: 1.6326 - regression_loss: 1.4059 - classification_loss: 0.2266 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6316 - regression_loss: 1.4051 - classification_loss: 0.2265 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6315 - regression_loss: 1.4050 - classification_loss: 0.2264 261/500 [==============>...............] - ETA: 1:00 - loss: 1.6320 - regression_loss: 1.4055 - classification_loss: 0.2265 262/500 [==============>...............] - ETA: 1:00 - loss: 1.6323 - regression_loss: 1.4058 - classification_loss: 0.2264 263/500 [==============>...............] - ETA: 59s - loss: 1.6299 - regression_loss: 1.4040 - classification_loss: 0.2259  264/500 [==============>...............] - ETA: 59s - loss: 1.6275 - regression_loss: 1.4020 - classification_loss: 0.2255 265/500 [==============>...............] - ETA: 59s - loss: 1.6264 - regression_loss: 1.4011 - classification_loss: 0.2254 266/500 [==============>...............] - ETA: 59s - loss: 1.6272 - regression_loss: 1.4017 - classification_loss: 0.2255 267/500 [===============>..............] - ETA: 58s - loss: 1.6292 - regression_loss: 1.4030 - classification_loss: 0.2262 268/500 [===============>..............] - ETA: 58s - loss: 1.6289 - regression_loss: 1.4025 - classification_loss: 0.2265 269/500 [===============>..............] - ETA: 58s - loss: 1.6286 - regression_loss: 1.4021 - classification_loss: 0.2265 270/500 [===============>..............] 
- ETA: 58s - loss: 1.6285 - regression_loss: 1.4020 - classification_loss: 0.2265 271/500 [===============>..............] - ETA: 57s - loss: 1.6297 - regression_loss: 1.4030 - classification_loss: 0.2267 272/500 [===============>..............] - ETA: 57s - loss: 1.6298 - regression_loss: 1.4032 - classification_loss: 0.2266 273/500 [===============>..............] - ETA: 57s - loss: 1.6295 - regression_loss: 1.4030 - classification_loss: 0.2265 274/500 [===============>..............] - ETA: 57s - loss: 1.6309 - regression_loss: 1.4043 - classification_loss: 0.2266 275/500 [===============>..............] - ETA: 56s - loss: 1.6329 - regression_loss: 1.4058 - classification_loss: 0.2271 276/500 [===============>..............] - ETA: 56s - loss: 1.6346 - regression_loss: 1.4071 - classification_loss: 0.2274 277/500 [===============>..............] - ETA: 56s - loss: 1.6334 - regression_loss: 1.4064 - classification_loss: 0.2271 278/500 [===============>..............] - ETA: 56s - loss: 1.6331 - regression_loss: 1.4061 - classification_loss: 0.2270 279/500 [===============>..............] - ETA: 55s - loss: 1.6333 - regression_loss: 1.4063 - classification_loss: 0.2270 280/500 [===============>..............] - ETA: 55s - loss: 1.6347 - regression_loss: 1.4075 - classification_loss: 0.2272 281/500 [===============>..............] - ETA: 55s - loss: 1.6320 - regression_loss: 1.4052 - classification_loss: 0.2268 282/500 [===============>..............] - ETA: 55s - loss: 1.6319 - regression_loss: 1.4051 - classification_loss: 0.2267 283/500 [===============>..............] - ETA: 54s - loss: 1.6326 - regression_loss: 1.4058 - classification_loss: 0.2268 284/500 [================>.............] - ETA: 54s - loss: 1.6324 - regression_loss: 1.4057 - classification_loss: 0.2267 285/500 [================>.............] - ETA: 54s - loss: 1.6340 - regression_loss: 1.4069 - classification_loss: 0.2271 286/500 [================>.............] 
- ETA: 54s - loss: 1.6314 - regression_loss: 1.4048 - classification_loss: 0.2266 287/500 [================>.............] - ETA: 53s - loss: 1.6330 - regression_loss: 1.4061 - classification_loss: 0.2268 288/500 [================>.............] - ETA: 53s - loss: 1.6332 - regression_loss: 1.4065 - classification_loss: 0.2267 289/500 [================>.............] - ETA: 53s - loss: 1.6303 - regression_loss: 1.4041 - classification_loss: 0.2262 290/500 [================>.............] - ETA: 53s - loss: 1.6274 - regression_loss: 1.4014 - classification_loss: 0.2260 291/500 [================>.............] - ETA: 52s - loss: 1.6285 - regression_loss: 1.4025 - classification_loss: 0.2261 292/500 [================>.............] - ETA: 52s - loss: 1.6291 - regression_loss: 1.4029 - classification_loss: 0.2261 293/500 [================>.............] - ETA: 52s - loss: 1.6262 - regression_loss: 1.4004 - classification_loss: 0.2258 294/500 [================>.............] - ETA: 52s - loss: 1.6282 - regression_loss: 1.4022 - classification_loss: 0.2260 295/500 [================>.............] - ETA: 51s - loss: 1.6284 - regression_loss: 1.4024 - classification_loss: 0.2260 296/500 [================>.............] - ETA: 51s - loss: 1.6308 - regression_loss: 1.4043 - classification_loss: 0.2265 297/500 [================>.............] - ETA: 51s - loss: 1.6337 - regression_loss: 1.4070 - classification_loss: 0.2267 298/500 [================>.............] - ETA: 51s - loss: 1.6343 - regression_loss: 1.4076 - classification_loss: 0.2267 299/500 [================>.............] - ETA: 50s - loss: 1.6341 - regression_loss: 1.4074 - classification_loss: 0.2267 300/500 [=================>............] - ETA: 50s - loss: 1.6352 - regression_loss: 1.4081 - classification_loss: 0.2271 301/500 [=================>............] - ETA: 50s - loss: 1.6346 - regression_loss: 1.4076 - classification_loss: 0.2270 302/500 [=================>............] 
- ETA: 50s - loss: 1.6370 - regression_loss: 1.4094 - classification_loss: 0.2275 303/500 [=================>............] - ETA: 49s - loss: 1.6355 - regression_loss: 1.4082 - classification_loss: 0.2273 304/500 [=================>............] - ETA: 49s - loss: 1.6363 - regression_loss: 1.4091 - classification_loss: 0.2272 305/500 [=================>............] - ETA: 49s - loss: 1.6368 - regression_loss: 1.4090 - classification_loss: 0.2277 306/500 [=================>............] - ETA: 49s - loss: 1.6379 - regression_loss: 1.4100 - classification_loss: 0.2279 307/500 [=================>............] - ETA: 48s - loss: 1.6362 - regression_loss: 1.4086 - classification_loss: 0.2277 308/500 [=================>............] - ETA: 48s - loss: 1.6370 - regression_loss: 1.4093 - classification_loss: 0.2277 309/500 [=================>............] - ETA: 48s - loss: 1.6387 - regression_loss: 1.4106 - classification_loss: 0.2280 310/500 [=================>............] - ETA: 48s - loss: 1.6378 - regression_loss: 1.4102 - classification_loss: 0.2277 311/500 [=================>............] - ETA: 47s - loss: 1.6353 - regression_loss: 1.4080 - classification_loss: 0.2273 312/500 [=================>............] - ETA: 47s - loss: 1.6342 - regression_loss: 1.4073 - classification_loss: 0.2270 313/500 [=================>............] - ETA: 47s - loss: 1.6343 - regression_loss: 1.4073 - classification_loss: 0.2270 314/500 [=================>............] - ETA: 47s - loss: 1.6326 - regression_loss: 1.4059 - classification_loss: 0.2267 315/500 [=================>............] - ETA: 46s - loss: 1.6304 - regression_loss: 1.4041 - classification_loss: 0.2263 316/500 [=================>............] - ETA: 46s - loss: 1.6314 - regression_loss: 1.4050 - classification_loss: 0.2264 317/500 [==================>...........] - ETA: 46s - loss: 1.6315 - regression_loss: 1.4052 - classification_loss: 0.2264 318/500 [==================>...........] 
- ETA: 46s - loss: 1.6317 - regression_loss: 1.4054 - classification_loss: 0.2263 319/500 [==================>...........] - ETA: 45s - loss: 1.6293 - regression_loss: 1.4035 - classification_loss: 0.2258 320/500 [==================>...........] - ETA: 45s - loss: 1.6291 - regression_loss: 1.4035 - classification_loss: 0.2257 321/500 [==================>...........] - ETA: 45s - loss: 1.6285 - regression_loss: 1.4029 - classification_loss: 0.2255 322/500 [==================>...........] - ETA: 45s - loss: 1.6291 - regression_loss: 1.4034 - classification_loss: 0.2256 323/500 [==================>...........] - ETA: 44s - loss: 1.6280 - regression_loss: 1.4027 - classification_loss: 0.2252 324/500 [==================>...........] - ETA: 44s - loss: 1.6273 - regression_loss: 1.4022 - classification_loss: 0.2251 325/500 [==================>...........] - ETA: 44s - loss: 1.6258 - regression_loss: 1.4010 - classification_loss: 0.2248 326/500 [==================>...........] - ETA: 43s - loss: 1.6217 - regression_loss: 1.3974 - classification_loss: 0.2243 327/500 [==================>...........] - ETA: 43s - loss: 1.6219 - regression_loss: 1.3975 - classification_loss: 0.2243 328/500 [==================>...........] - ETA: 43s - loss: 1.6221 - regression_loss: 1.3977 - classification_loss: 0.2244 329/500 [==================>...........] - ETA: 43s - loss: 1.6234 - regression_loss: 1.3989 - classification_loss: 0.2246 330/500 [==================>...........] - ETA: 42s - loss: 1.6246 - regression_loss: 1.3999 - classification_loss: 0.2248 331/500 [==================>...........] - ETA: 42s - loss: 1.6258 - regression_loss: 1.4011 - classification_loss: 0.2247 332/500 [==================>...........] - ETA: 42s - loss: 1.6266 - regression_loss: 1.4017 - classification_loss: 0.2249 333/500 [==================>...........] - ETA: 42s - loss: 1.6277 - regression_loss: 1.4025 - classification_loss: 0.2252 334/500 [===================>..........] 
- ETA: 41s - loss: 1.6273 - regression_loss: 1.4022 - classification_loss: 0.2251 335/500 [===================>..........] - ETA: 41s - loss: 1.6278 - regression_loss: 1.4026 - classification_loss: 0.2251 336/500 [===================>..........] - ETA: 41s - loss: 1.6293 - regression_loss: 1.4040 - classification_loss: 0.2254 337/500 [===================>..........] - ETA: 41s - loss: 1.6309 - regression_loss: 1.4053 - classification_loss: 0.2256 338/500 [===================>..........] - ETA: 40s - loss: 1.6291 - regression_loss: 1.4036 - classification_loss: 0.2255 339/500 [===================>..........] - ETA: 40s - loss: 1.6298 - regression_loss: 1.4041 - classification_loss: 0.2256 340/500 [===================>..........] - ETA: 40s - loss: 1.6272 - regression_loss: 1.4020 - classification_loss: 0.2252 341/500 [===================>..........] - ETA: 40s - loss: 1.6262 - regression_loss: 1.4013 - classification_loss: 0.2250 342/500 [===================>..........] - ETA: 39s - loss: 1.6268 - regression_loss: 1.4018 - classification_loss: 0.2250 343/500 [===================>..........] - ETA: 39s - loss: 1.6269 - regression_loss: 1.4019 - classification_loss: 0.2250 344/500 [===================>..........] - ETA: 39s - loss: 1.6282 - regression_loss: 1.4029 - classification_loss: 0.2253 345/500 [===================>..........] - ETA: 39s - loss: 1.6294 - regression_loss: 1.4039 - classification_loss: 0.2255 346/500 [===================>..........] - ETA: 38s - loss: 1.6306 - regression_loss: 1.4051 - classification_loss: 0.2255 347/500 [===================>..........] - ETA: 38s - loss: 1.6317 - regression_loss: 1.4061 - classification_loss: 0.2256 348/500 [===================>..........] - ETA: 38s - loss: 1.6313 - regression_loss: 1.4056 - classification_loss: 0.2257 349/500 [===================>..........] - ETA: 38s - loss: 1.6325 - regression_loss: 1.4066 - classification_loss: 0.2259 350/500 [====================>.........] 
- ETA: 37s - loss: 1.6329 - regression_loss: 1.4069 - classification_loss: 0.2260 351/500 [====================>.........] - ETA: 37s - loss: 1.6332 - regression_loss: 1.4070 - classification_loss: 0.2262 352/500 [====================>.........] - ETA: 37s - loss: 1.6338 - regression_loss: 1.4075 - classification_loss: 0.2263 353/500 [====================>.........] - ETA: 37s - loss: 1.6344 - regression_loss: 1.4081 - classification_loss: 0.2263 354/500 [====================>.........] - ETA: 36s - loss: 1.6344 - regression_loss: 1.4081 - classification_loss: 0.2263 355/500 [====================>.........] - ETA: 36s - loss: 1.6335 - regression_loss: 1.4073 - classification_loss: 0.2263 356/500 [====================>.........] - ETA: 36s - loss: 1.6336 - regression_loss: 1.4072 - classification_loss: 0.2263 357/500 [====================>.........] - ETA: 36s - loss: 1.6341 - regression_loss: 1.4078 - classification_loss: 0.2263 358/500 [====================>.........] - ETA: 35s - loss: 1.6335 - regression_loss: 1.4074 - classification_loss: 0.2261 359/500 [====================>.........] - ETA: 35s - loss: 1.6350 - regression_loss: 1.4086 - classification_loss: 0.2264 360/500 [====================>.........] - ETA: 35s - loss: 1.6330 - regression_loss: 1.4070 - classification_loss: 0.2261 361/500 [====================>.........] - ETA: 35s - loss: 1.6327 - regression_loss: 1.4068 - classification_loss: 0.2259 362/500 [====================>.........] - ETA: 34s - loss: 1.6316 - regression_loss: 1.4058 - classification_loss: 0.2257 363/500 [====================>.........] - ETA: 34s - loss: 1.6322 - regression_loss: 1.4064 - classification_loss: 0.2258 364/500 [====================>.........] - ETA: 34s - loss: 1.6323 - regression_loss: 1.4065 - classification_loss: 0.2258 365/500 [====================>.........] - ETA: 34s - loss: 1.6336 - regression_loss: 1.4074 - classification_loss: 0.2262 366/500 [====================>.........] 
[per-step progress output for epoch 11, steps 367-499/500, omitted; running loss held near 1.63 throughout]
500/500 [==============================] - 127s 253ms/step - loss: 1.6382 - regression_loss: 1.4119 - classification_loss: 0.2263
1172 instances of class plum with average precision: 0.7125
mAP: 0.7125
Epoch 00011: saving model to ./training/snapshots/resnet50_pascal_11.h5
Epoch 12/150
[per-step progress output for epoch 12, steps 1-201/500, omitted; running loss settled near 1.59 by step 200, section truncated mid-epoch]
- ETA: 1:15 - loss: 1.5851 - regression_loss: 1.3528 - classification_loss: 0.2322 202/500 [===========>..................] - ETA: 1:15 - loss: 1.5865 - regression_loss: 1.3541 - classification_loss: 0.2325 203/500 [===========>..................] - ETA: 1:15 - loss: 1.5877 - regression_loss: 1.3551 - classification_loss: 0.2326 204/500 [===========>..................] - ETA: 1:14 - loss: 1.5923 - regression_loss: 1.3584 - classification_loss: 0.2339 205/500 [===========>..................] - ETA: 1:14 - loss: 1.5930 - regression_loss: 1.3585 - classification_loss: 0.2344 206/500 [===========>..................] - ETA: 1:14 - loss: 1.5920 - regression_loss: 1.3575 - classification_loss: 0.2344 207/500 [===========>..................] - ETA: 1:14 - loss: 1.5885 - regression_loss: 1.3546 - classification_loss: 0.2339 208/500 [===========>..................] - ETA: 1:13 - loss: 1.5897 - regression_loss: 1.3556 - classification_loss: 0.2341 209/500 [===========>..................] - ETA: 1:13 - loss: 1.5904 - regression_loss: 1.3559 - classification_loss: 0.2345 210/500 [===========>..................] - ETA: 1:13 - loss: 1.5929 - regression_loss: 1.3583 - classification_loss: 0.2346 211/500 [===========>..................] - ETA: 1:13 - loss: 1.5926 - regression_loss: 1.3581 - classification_loss: 0.2344 212/500 [===========>..................] - ETA: 1:12 - loss: 1.5902 - regression_loss: 1.3563 - classification_loss: 0.2339 213/500 [===========>..................] - ETA: 1:12 - loss: 1.5906 - regression_loss: 1.3565 - classification_loss: 0.2340 214/500 [===========>..................] - ETA: 1:12 - loss: 1.5932 - regression_loss: 1.3589 - classification_loss: 0.2342 215/500 [===========>..................] - ETA: 1:12 - loss: 1.5933 - regression_loss: 1.3593 - classification_loss: 0.2340 216/500 [===========>..................] - ETA: 1:11 - loss: 1.5941 - regression_loss: 1.3602 - classification_loss: 0.2339 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.5954 - regression_loss: 1.3615 - classification_loss: 0.2339 218/500 [============>.................] - ETA: 1:11 - loss: 1.5970 - regression_loss: 1.3629 - classification_loss: 0.2341 219/500 [============>.................] - ETA: 1:11 - loss: 1.5977 - regression_loss: 1.3634 - classification_loss: 0.2342 220/500 [============>.................] - ETA: 1:10 - loss: 1.5994 - regression_loss: 1.3649 - classification_loss: 0.2345 221/500 [============>.................] - ETA: 1:10 - loss: 1.6003 - regression_loss: 1.3658 - classification_loss: 0.2345 222/500 [============>.................] - ETA: 1:10 - loss: 1.6011 - regression_loss: 1.3664 - classification_loss: 0.2346 223/500 [============>.................] - ETA: 1:10 - loss: 1.5991 - regression_loss: 1.3651 - classification_loss: 0.2340 224/500 [============>.................] - ETA: 1:09 - loss: 1.6019 - regression_loss: 1.3674 - classification_loss: 0.2344 225/500 [============>.................] - ETA: 1:09 - loss: 1.6018 - regression_loss: 1.3674 - classification_loss: 0.2344 226/500 [============>.................] - ETA: 1:09 - loss: 1.6025 - regression_loss: 1.3682 - classification_loss: 0.2343 227/500 [============>.................] - ETA: 1:09 - loss: 1.6020 - regression_loss: 1.3678 - classification_loss: 0.2343 228/500 [============>.................] - ETA: 1:08 - loss: 1.6034 - regression_loss: 1.3691 - classification_loss: 0.2343 229/500 [============>.................] - ETA: 1:08 - loss: 1.6022 - regression_loss: 1.3682 - classification_loss: 0.2340 230/500 [============>.................] - ETA: 1:08 - loss: 1.6046 - regression_loss: 1.3702 - classification_loss: 0.2344 231/500 [============>.................] - ETA: 1:08 - loss: 1.6036 - regression_loss: 1.3695 - classification_loss: 0.2342 232/500 [============>.................] - ETA: 1:07 - loss: 1.6062 - regression_loss: 1.3718 - classification_loss: 0.2344 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.6016 - regression_loss: 1.3677 - classification_loss: 0.2339 234/500 [=============>................] - ETA: 1:07 - loss: 1.6029 - regression_loss: 1.3692 - classification_loss: 0.2337 235/500 [=============>................] - ETA: 1:07 - loss: 1.6023 - regression_loss: 1.3688 - classification_loss: 0.2335 236/500 [=============>................] - ETA: 1:06 - loss: 1.6041 - regression_loss: 1.3706 - classification_loss: 0.2335 237/500 [=============>................] - ETA: 1:06 - loss: 1.6013 - regression_loss: 1.3685 - classification_loss: 0.2329 238/500 [=============>................] - ETA: 1:06 - loss: 1.6011 - regression_loss: 1.3684 - classification_loss: 0.2327 239/500 [=============>................] - ETA: 1:06 - loss: 1.5989 - regression_loss: 1.3666 - classification_loss: 0.2323 240/500 [=============>................] - ETA: 1:05 - loss: 1.5993 - regression_loss: 1.3671 - classification_loss: 0.2323 241/500 [=============>................] - ETA: 1:05 - loss: 1.5975 - regression_loss: 1.3657 - classification_loss: 0.2318 242/500 [=============>................] - ETA: 1:05 - loss: 1.6007 - regression_loss: 1.3684 - classification_loss: 0.2324 243/500 [=============>................] - ETA: 1:05 - loss: 1.6017 - regression_loss: 1.3692 - classification_loss: 0.2325 244/500 [=============>................] - ETA: 1:04 - loss: 1.6008 - regression_loss: 1.3685 - classification_loss: 0.2323 245/500 [=============>................] - ETA: 1:04 - loss: 1.5968 - regression_loss: 1.3650 - classification_loss: 0.2318 246/500 [=============>................] - ETA: 1:04 - loss: 1.5949 - regression_loss: 1.3634 - classification_loss: 0.2315 247/500 [=============>................] - ETA: 1:04 - loss: 1.5959 - regression_loss: 1.3644 - classification_loss: 0.2315 248/500 [=============>................] - ETA: 1:03 - loss: 1.5963 - regression_loss: 1.3649 - classification_loss: 0.2314 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.5967 - regression_loss: 1.3653 - classification_loss: 0.2314 250/500 [==============>...............] - ETA: 1:03 - loss: 1.5984 - regression_loss: 1.3666 - classification_loss: 0.2319 251/500 [==============>...............] - ETA: 1:03 - loss: 1.5982 - regression_loss: 1.3665 - classification_loss: 0.2318 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5986 - regression_loss: 1.3669 - classification_loss: 0.2317 253/500 [==============>...............] - ETA: 1:02 - loss: 1.5984 - regression_loss: 1.3668 - classification_loss: 0.2316 254/500 [==============>...............] - ETA: 1:02 - loss: 1.5964 - regression_loss: 1.3653 - classification_loss: 0.2311 255/500 [==============>...............] - ETA: 1:02 - loss: 1.5964 - regression_loss: 1.3654 - classification_loss: 0.2309 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5986 - regression_loss: 1.3673 - classification_loss: 0.2313 257/500 [==============>...............] - ETA: 1:01 - loss: 1.5946 - regression_loss: 1.3639 - classification_loss: 0.2307 258/500 [==============>...............] - ETA: 1:01 - loss: 1.5944 - regression_loss: 1.3638 - classification_loss: 0.2306 259/500 [==============>...............] - ETA: 1:01 - loss: 1.5935 - regression_loss: 1.3630 - classification_loss: 0.2305 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5945 - regression_loss: 1.3638 - classification_loss: 0.2307 261/500 [==============>...............] - ETA: 1:00 - loss: 1.5947 - regression_loss: 1.3640 - classification_loss: 0.2307 262/500 [==============>...............] - ETA: 1:00 - loss: 1.5955 - regression_loss: 1.3648 - classification_loss: 0.2307 263/500 [==============>...............] - ETA: 1:00 - loss: 1.5921 - regression_loss: 1.3621 - classification_loss: 0.2300 264/500 [==============>...............] - ETA: 59s - loss: 1.5914 - regression_loss: 1.3615 - classification_loss: 0.2299  265/500 [==============>...............] 
- ETA: 59s - loss: 1.5905 - regression_loss: 1.3608 - classification_loss: 0.2297 266/500 [==============>...............] - ETA: 59s - loss: 1.5883 - regression_loss: 1.3591 - classification_loss: 0.2292 267/500 [===============>..............] - ETA: 59s - loss: 1.5895 - regression_loss: 1.3603 - classification_loss: 0.2292 268/500 [===============>..............] - ETA: 58s - loss: 1.5864 - regression_loss: 1.3578 - classification_loss: 0.2286 269/500 [===============>..............] - ETA: 58s - loss: 1.5855 - regression_loss: 1.3572 - classification_loss: 0.2282 270/500 [===============>..............] - ETA: 58s - loss: 1.5867 - regression_loss: 1.3583 - classification_loss: 0.2284 271/500 [===============>..............] - ETA: 58s - loss: 1.5876 - regression_loss: 1.3594 - classification_loss: 0.2282 272/500 [===============>..............] - ETA: 57s - loss: 1.5895 - regression_loss: 1.3611 - classification_loss: 0.2284 273/500 [===============>..............] - ETA: 57s - loss: 1.5895 - regression_loss: 1.3610 - classification_loss: 0.2286 274/500 [===============>..............] - ETA: 57s - loss: 1.5892 - regression_loss: 1.3606 - classification_loss: 0.2286 275/500 [===============>..............] - ETA: 57s - loss: 1.5946 - regression_loss: 1.3652 - classification_loss: 0.2295 276/500 [===============>..............] - ETA: 56s - loss: 1.5944 - regression_loss: 1.3650 - classification_loss: 0.2295 277/500 [===============>..............] - ETA: 56s - loss: 1.5954 - regression_loss: 1.3659 - classification_loss: 0.2295 278/500 [===============>..............] - ETA: 56s - loss: 1.5960 - regression_loss: 1.3664 - classification_loss: 0.2295 279/500 [===============>..............] - ETA: 56s - loss: 1.5930 - regression_loss: 1.3639 - classification_loss: 0.2291 280/500 [===============>..............] - ETA: 55s - loss: 1.5932 - regression_loss: 1.3639 - classification_loss: 0.2293 281/500 [===============>..............] 
- ETA: 55s - loss: 1.5967 - regression_loss: 1.3673 - classification_loss: 0.2293 282/500 [===============>..............] - ETA: 55s - loss: 1.5974 - regression_loss: 1.3679 - classification_loss: 0.2294 283/500 [===============>..............] - ETA: 54s - loss: 1.5982 - regression_loss: 1.3686 - classification_loss: 0.2296 284/500 [================>.............] - ETA: 54s - loss: 1.5965 - regression_loss: 1.3673 - classification_loss: 0.2292 285/500 [================>.............] - ETA: 54s - loss: 1.5973 - regression_loss: 1.3678 - classification_loss: 0.2295 286/500 [================>.............] - ETA: 54s - loss: 1.5971 - regression_loss: 1.3677 - classification_loss: 0.2293 287/500 [================>.............] - ETA: 53s - loss: 1.5945 - regression_loss: 1.3653 - classification_loss: 0.2292 288/500 [================>.............] - ETA: 53s - loss: 1.5952 - regression_loss: 1.3660 - classification_loss: 0.2292 289/500 [================>.............] - ETA: 53s - loss: 1.5946 - regression_loss: 1.3658 - classification_loss: 0.2288 290/500 [================>.............] - ETA: 53s - loss: 1.5919 - regression_loss: 1.3635 - classification_loss: 0.2284 291/500 [================>.............] - ETA: 52s - loss: 1.5922 - regression_loss: 1.3638 - classification_loss: 0.2284 292/500 [================>.............] - ETA: 52s - loss: 1.5937 - regression_loss: 1.3651 - classification_loss: 0.2286 293/500 [================>.............] - ETA: 52s - loss: 1.5915 - regression_loss: 1.3632 - classification_loss: 0.2283 294/500 [================>.............] - ETA: 52s - loss: 1.5927 - regression_loss: 1.3644 - classification_loss: 0.2283 295/500 [================>.............] - ETA: 51s - loss: 1.5928 - regression_loss: 1.3646 - classification_loss: 0.2283 296/500 [================>.............] - ETA: 51s - loss: 1.5924 - regression_loss: 1.3643 - classification_loss: 0.2281 297/500 [================>.............] 
- ETA: 51s - loss: 1.5891 - regression_loss: 1.3616 - classification_loss: 0.2275 298/500 [================>.............] - ETA: 51s - loss: 1.5918 - regression_loss: 1.3641 - classification_loss: 0.2277 299/500 [================>.............] - ETA: 50s - loss: 1.5928 - regression_loss: 1.3649 - classification_loss: 0.2279 300/500 [=================>............] - ETA: 50s - loss: 1.5938 - regression_loss: 1.3659 - classification_loss: 0.2279 301/500 [=================>............] - ETA: 50s - loss: 1.5903 - regression_loss: 1.3628 - classification_loss: 0.2274 302/500 [=================>............] - ETA: 50s - loss: 1.5909 - regression_loss: 1.3633 - classification_loss: 0.2276 303/500 [=================>............] - ETA: 49s - loss: 1.5897 - regression_loss: 1.3623 - classification_loss: 0.2274 304/500 [=================>............] - ETA: 49s - loss: 1.5904 - regression_loss: 1.3629 - classification_loss: 0.2275 305/500 [=================>............] - ETA: 49s - loss: 1.5898 - regression_loss: 1.3626 - classification_loss: 0.2272 306/500 [=================>............] - ETA: 49s - loss: 1.5907 - regression_loss: 1.3634 - classification_loss: 0.2273 307/500 [=================>............] - ETA: 48s - loss: 1.5915 - regression_loss: 1.3641 - classification_loss: 0.2274 308/500 [=================>............] - ETA: 48s - loss: 1.5940 - regression_loss: 1.3659 - classification_loss: 0.2282 309/500 [=================>............] - ETA: 48s - loss: 1.5950 - regression_loss: 1.3669 - classification_loss: 0.2281 310/500 [=================>............] - ETA: 48s - loss: 1.5963 - regression_loss: 1.3684 - classification_loss: 0.2279 311/500 [=================>............] - ETA: 47s - loss: 1.5951 - regression_loss: 1.3676 - classification_loss: 0.2276 312/500 [=================>............] - ETA: 47s - loss: 1.5960 - regression_loss: 1.3683 - classification_loss: 0.2277 313/500 [=================>............] 
- ETA: 47s - loss: 1.5939 - regression_loss: 1.3666 - classification_loss: 0.2273 314/500 [=================>............] - ETA: 47s - loss: 1.5947 - regression_loss: 1.3671 - classification_loss: 0.2276 315/500 [=================>............] - ETA: 46s - loss: 1.5924 - regression_loss: 1.3651 - classification_loss: 0.2273 316/500 [=================>............] - ETA: 46s - loss: 1.5933 - regression_loss: 1.3658 - classification_loss: 0.2275 317/500 [==================>...........] - ETA: 46s - loss: 1.5920 - regression_loss: 1.3647 - classification_loss: 0.2272 318/500 [==================>...........] - ETA: 46s - loss: 1.5928 - regression_loss: 1.3654 - classification_loss: 0.2274 319/500 [==================>...........] - ETA: 45s - loss: 1.5922 - regression_loss: 1.3648 - classification_loss: 0.2273 320/500 [==================>...........] - ETA: 45s - loss: 1.5922 - regression_loss: 1.3649 - classification_loss: 0.2273 321/500 [==================>...........] - ETA: 45s - loss: 1.5918 - regression_loss: 1.3646 - classification_loss: 0.2271 322/500 [==================>...........] - ETA: 45s - loss: 1.5912 - regression_loss: 1.3642 - classification_loss: 0.2270 323/500 [==================>...........] - ETA: 44s - loss: 1.5906 - regression_loss: 1.3638 - classification_loss: 0.2268 324/500 [==================>...........] - ETA: 44s - loss: 1.5906 - regression_loss: 1.3638 - classification_loss: 0.2269 325/500 [==================>...........] - ETA: 44s - loss: 1.5914 - regression_loss: 1.3645 - classification_loss: 0.2269 326/500 [==================>...........] - ETA: 44s - loss: 1.5903 - regression_loss: 1.3636 - classification_loss: 0.2267 327/500 [==================>...........] - ETA: 43s - loss: 1.5910 - regression_loss: 1.3642 - classification_loss: 0.2269 328/500 [==================>...........] - ETA: 43s - loss: 1.5913 - regression_loss: 1.3643 - classification_loss: 0.2270 329/500 [==================>...........] 
- ETA: 43s - loss: 1.5913 - regression_loss: 1.3643 - classification_loss: 0.2269 330/500 [==================>...........] - ETA: 43s - loss: 1.5887 - regression_loss: 1.3623 - classification_loss: 0.2264 331/500 [==================>...........] - ETA: 42s - loss: 1.5891 - regression_loss: 1.3626 - classification_loss: 0.2265 332/500 [==================>...........] - ETA: 42s - loss: 1.5887 - regression_loss: 1.3620 - classification_loss: 0.2266 333/500 [==================>...........] - ETA: 42s - loss: 1.5862 - regression_loss: 1.3601 - classification_loss: 0.2262 334/500 [===================>..........] - ETA: 42s - loss: 1.5860 - regression_loss: 1.3599 - classification_loss: 0.2261 335/500 [===================>..........] - ETA: 41s - loss: 1.5856 - regression_loss: 1.3598 - classification_loss: 0.2258 336/500 [===================>..........] - ETA: 41s - loss: 1.5858 - regression_loss: 1.3599 - classification_loss: 0.2258 337/500 [===================>..........] - ETA: 41s - loss: 1.5847 - regression_loss: 1.3591 - classification_loss: 0.2256 338/500 [===================>..........] - ETA: 41s - loss: 1.5864 - regression_loss: 1.3605 - classification_loss: 0.2258 339/500 [===================>..........] - ETA: 40s - loss: 1.5857 - regression_loss: 1.3600 - classification_loss: 0.2257 340/500 [===================>..........] - ETA: 40s - loss: 1.5857 - regression_loss: 1.3602 - classification_loss: 0.2255 341/500 [===================>..........] - ETA: 40s - loss: 1.5865 - regression_loss: 1.3609 - classification_loss: 0.2256 342/500 [===================>..........] - ETA: 40s - loss: 1.5877 - regression_loss: 1.3621 - classification_loss: 0.2256 343/500 [===================>..........] - ETA: 39s - loss: 1.5879 - regression_loss: 1.3622 - classification_loss: 0.2257 344/500 [===================>..........] - ETA: 39s - loss: 1.5884 - regression_loss: 1.3626 - classification_loss: 0.2258 345/500 [===================>..........] 
- ETA: 39s - loss: 1.5865 - regression_loss: 1.3610 - classification_loss: 0.2255 346/500 [===================>..........] - ETA: 39s - loss: 1.5898 - regression_loss: 1.3638 - classification_loss: 0.2260 347/500 [===================>..........] - ETA: 38s - loss: 1.5906 - regression_loss: 1.3646 - classification_loss: 0.2260 348/500 [===================>..........] - ETA: 38s - loss: 1.5915 - regression_loss: 1.3654 - classification_loss: 0.2261 349/500 [===================>..........] - ETA: 38s - loss: 1.5917 - regression_loss: 1.3657 - classification_loss: 0.2259 350/500 [====================>.........] - ETA: 37s - loss: 1.5918 - regression_loss: 1.3658 - classification_loss: 0.2260 351/500 [====================>.........] - ETA: 37s - loss: 1.5941 - regression_loss: 1.3677 - classification_loss: 0.2264 352/500 [====================>.........] - ETA: 37s - loss: 1.5969 - regression_loss: 1.3701 - classification_loss: 0.2267 353/500 [====================>.........] - ETA: 37s - loss: 1.5965 - regression_loss: 1.3698 - classification_loss: 0.2267 354/500 [====================>.........] - ETA: 36s - loss: 1.5968 - regression_loss: 1.3701 - classification_loss: 0.2267 355/500 [====================>.........] - ETA: 36s - loss: 1.5944 - regression_loss: 1.3681 - classification_loss: 0.2263 356/500 [====================>.........] - ETA: 36s - loss: 1.5943 - regression_loss: 1.3680 - classification_loss: 0.2263 357/500 [====================>.........] - ETA: 36s - loss: 1.5955 - regression_loss: 1.3691 - classification_loss: 0.2265 358/500 [====================>.........] - ETA: 35s - loss: 1.5961 - regression_loss: 1.3695 - classification_loss: 0.2266 359/500 [====================>.........] - ETA: 35s - loss: 1.5961 - regression_loss: 1.3695 - classification_loss: 0.2266 360/500 [====================>.........] - ETA: 35s - loss: 1.5964 - regression_loss: 1.3698 - classification_loss: 0.2266 361/500 [====================>.........] 
- ETA: 35s - loss: 1.5958 - regression_loss: 1.3692 - classification_loss: 0.2266 362/500 [====================>.........] - ETA: 34s - loss: 1.5961 - regression_loss: 1.3695 - classification_loss: 0.2266 363/500 [====================>.........] - ETA: 34s - loss: 1.5954 - regression_loss: 1.3689 - classification_loss: 0.2265 364/500 [====================>.........] - ETA: 34s - loss: 1.5953 - regression_loss: 1.3688 - classification_loss: 0.2264 365/500 [====================>.........] - ETA: 34s - loss: 1.5949 - regression_loss: 1.3686 - classification_loss: 0.2263 366/500 [====================>.........] - ETA: 33s - loss: 1.5962 - regression_loss: 1.3696 - classification_loss: 0.2266 367/500 [=====================>........] - ETA: 33s - loss: 1.5967 - regression_loss: 1.3700 - classification_loss: 0.2266 368/500 [=====================>........] - ETA: 33s - loss: 1.5949 - regression_loss: 1.3686 - classification_loss: 0.2263 369/500 [=====================>........] - ETA: 33s - loss: 1.5958 - regression_loss: 1.3695 - classification_loss: 0.2263 370/500 [=====================>........] - ETA: 32s - loss: 1.5957 - regression_loss: 1.3694 - classification_loss: 0.2263 371/500 [=====================>........] - ETA: 32s - loss: 1.5959 - regression_loss: 1.3696 - classification_loss: 0.2263 372/500 [=====================>........] - ETA: 32s - loss: 1.5954 - regression_loss: 1.3691 - classification_loss: 0.2263 373/500 [=====================>........] - ETA: 32s - loss: 1.5943 - regression_loss: 1.3682 - classification_loss: 0.2261 374/500 [=====================>........] - ETA: 31s - loss: 1.5929 - regression_loss: 1.3669 - classification_loss: 0.2259 375/500 [=====================>........] - ETA: 31s - loss: 1.5912 - regression_loss: 1.3655 - classification_loss: 0.2256 376/500 [=====================>........] - ETA: 31s - loss: 1.5905 - regression_loss: 1.3648 - classification_loss: 0.2257 377/500 [=====================>........] 
- ETA: 31s - loss: 1.5910 - regression_loss: 1.3652 - classification_loss: 0.2257 378/500 [=====================>........] - ETA: 30s - loss: 1.5911 - regression_loss: 1.3652 - classification_loss: 0.2258 379/500 [=====================>........] - ETA: 30s - loss: 1.5912 - regression_loss: 1.3654 - classification_loss: 0.2259 380/500 [=====================>........] - ETA: 30s - loss: 1.5928 - regression_loss: 1.3665 - classification_loss: 0.2263 381/500 [=====================>........] - ETA: 30s - loss: 1.5925 - regression_loss: 1.3663 - classification_loss: 0.2262 382/500 [=====================>........] - ETA: 29s - loss: 1.5920 - regression_loss: 1.3659 - classification_loss: 0.2261 383/500 [=====================>........] - ETA: 29s - loss: 1.5931 - regression_loss: 1.3667 - classification_loss: 0.2263 384/500 [======================>.......] - ETA: 29s - loss: 1.5940 - regression_loss: 1.3677 - classification_loss: 0.2263 385/500 [======================>.......] - ETA: 29s - loss: 1.5943 - regression_loss: 1.3679 - classification_loss: 0.2263 386/500 [======================>.......] - ETA: 28s - loss: 1.5921 - regression_loss: 1.3660 - classification_loss: 0.2260 387/500 [======================>.......] - ETA: 28s - loss: 1.5916 - regression_loss: 1.3657 - classification_loss: 0.2259 388/500 [======================>.......] - ETA: 28s - loss: 1.5916 - regression_loss: 1.3657 - classification_loss: 0.2259 389/500 [======================>.......] - ETA: 28s - loss: 1.5905 - regression_loss: 1.3647 - classification_loss: 0.2258 390/500 [======================>.......] - ETA: 27s - loss: 1.5892 - regression_loss: 1.3637 - classification_loss: 0.2255 391/500 [======================>.......] - ETA: 27s - loss: 1.5878 - regression_loss: 1.3623 - classification_loss: 0.2254 392/500 [======================>.......] - ETA: 27s - loss: 1.5877 - regression_loss: 1.3624 - classification_loss: 0.2253 393/500 [======================>.......] 
- ETA: 27s - loss: 1.5891 - regression_loss: 1.3636 - classification_loss: 0.2256 394/500 [======================>.......] - ETA: 26s - loss: 1.5892 - regression_loss: 1.3637 - classification_loss: 0.2255 395/500 [======================>.......] - ETA: 26s - loss: 1.5900 - regression_loss: 1.3644 - classification_loss: 0.2257 396/500 [======================>.......] - ETA: 26s - loss: 1.5937 - regression_loss: 1.3673 - classification_loss: 0.2264 397/500 [======================>.......] - ETA: 26s - loss: 1.5965 - regression_loss: 1.3695 - classification_loss: 0.2271 398/500 [======================>.......] - ETA: 25s - loss: 1.5966 - regression_loss: 1.3698 - classification_loss: 0.2268 399/500 [======================>.......] - ETA: 25s - loss: 1.5980 - regression_loss: 1.3710 - classification_loss: 0.2271 400/500 [=======================>......] - ETA: 25s - loss: 1.5984 - regression_loss: 1.3711 - classification_loss: 0.2273 401/500 [=======================>......] - ETA: 25s - loss: 1.5994 - regression_loss: 1.3721 - classification_loss: 0.2273 402/500 [=======================>......] - ETA: 24s - loss: 1.5990 - regression_loss: 1.3717 - classification_loss: 0.2273 403/500 [=======================>......] - ETA: 24s - loss: 1.5993 - regression_loss: 1.3720 - classification_loss: 0.2272 404/500 [=======================>......] - ETA: 24s - loss: 1.5977 - regression_loss: 1.3707 - classification_loss: 0.2270 405/500 [=======================>......] - ETA: 24s - loss: 1.5982 - regression_loss: 1.3712 - classification_loss: 0.2270 406/500 [=======================>......] - ETA: 23s - loss: 1.5997 - regression_loss: 1.3727 - classification_loss: 0.2270 407/500 [=======================>......] - ETA: 23s - loss: 1.5992 - regression_loss: 1.3724 - classification_loss: 0.2268 408/500 [=======================>......] - ETA: 23s - loss: 1.6003 - regression_loss: 1.3735 - classification_loss: 0.2268 409/500 [=======================>......] 
500/500 [==============================] - 127s 254ms/step - loss: 1.5952 - regression_loss: 1.3713 - classification_loss: 0.2238
1172 instances of class plum with average precision: 0.7260
mAP: 0.7260
Epoch 00012: saving model to ./training/snapshots/resnet50_pascal_12.h5
Epoch 13/150
- ETA: 1:04 - loss: 1.5616 - regression_loss: 1.3469 - classification_loss: 0.2147 245/500 [=============>................] - ETA: 1:04 - loss: 1.5624 - regression_loss: 1.3476 - classification_loss: 0.2148 246/500 [=============>................] - ETA: 1:04 - loss: 1.5613 - regression_loss: 1.3468 - classification_loss: 0.2145 247/500 [=============>................] - ETA: 1:03 - loss: 1.5622 - regression_loss: 1.3475 - classification_loss: 0.2147 248/500 [=============>................] - ETA: 1:03 - loss: 1.5601 - regression_loss: 1.3457 - classification_loss: 0.2144 249/500 [=============>................] - ETA: 1:03 - loss: 1.5591 - regression_loss: 1.3449 - classification_loss: 0.2142 250/500 [==============>...............] - ETA: 1:03 - loss: 1.5597 - regression_loss: 1.3454 - classification_loss: 0.2143 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5599 - regression_loss: 1.3457 - classification_loss: 0.2141 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5627 - regression_loss: 1.3478 - classification_loss: 0.2150 253/500 [==============>...............] - ETA: 1:02 - loss: 1.5637 - regression_loss: 1.3486 - classification_loss: 0.2152 254/500 [==============>...............] - ETA: 1:02 - loss: 1.5640 - regression_loss: 1.3489 - classification_loss: 0.2151 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5600 - regression_loss: 1.3450 - classification_loss: 0.2149 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5605 - regression_loss: 1.3455 - classification_loss: 0.2150 257/500 [==============>...............] - ETA: 1:01 - loss: 1.5630 - regression_loss: 1.3478 - classification_loss: 0.2153 258/500 [==============>...............] - ETA: 1:01 - loss: 1.5637 - regression_loss: 1.3483 - classification_loss: 0.2154 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5616 - regression_loss: 1.3465 - classification_loss: 0.2151 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.5628 - regression_loss: 1.3477 - classification_loss: 0.2151 261/500 [==============>...............] - ETA: 1:00 - loss: 1.5606 - regression_loss: 1.3457 - classification_loss: 0.2148 262/500 [==============>...............] - ETA: 1:00 - loss: 1.5635 - regression_loss: 1.3485 - classification_loss: 0.2149 263/500 [==============>...............] - ETA: 59s - loss: 1.5641 - regression_loss: 1.3491 - classification_loss: 0.2150  264/500 [==============>...............] - ETA: 59s - loss: 1.5635 - regression_loss: 1.3486 - classification_loss: 0.2149 265/500 [==============>...............] - ETA: 59s - loss: 1.5645 - regression_loss: 1.3497 - classification_loss: 0.2148 266/500 [==============>...............] - ETA: 59s - loss: 1.5646 - regression_loss: 1.3498 - classification_loss: 0.2148 267/500 [===============>..............] - ETA: 58s - loss: 1.5614 - regression_loss: 1.3471 - classification_loss: 0.2143 268/500 [===============>..............] - ETA: 58s - loss: 1.5616 - regression_loss: 1.3474 - classification_loss: 0.2142 269/500 [===============>..............] - ETA: 58s - loss: 1.5627 - regression_loss: 1.3482 - classification_loss: 0.2145 270/500 [===============>..............] - ETA: 58s - loss: 1.5631 - regression_loss: 1.3485 - classification_loss: 0.2145 271/500 [===============>..............] - ETA: 57s - loss: 1.5658 - regression_loss: 1.3501 - classification_loss: 0.2157 272/500 [===============>..............] - ETA: 57s - loss: 1.5663 - regression_loss: 1.3505 - classification_loss: 0.2158 273/500 [===============>..............] - ETA: 57s - loss: 1.5623 - regression_loss: 1.3469 - classification_loss: 0.2153 274/500 [===============>..............] - ETA: 57s - loss: 1.5629 - regression_loss: 1.3474 - classification_loss: 0.2155 275/500 [===============>..............] - ETA: 56s - loss: 1.5646 - regression_loss: 1.3491 - classification_loss: 0.2156 276/500 [===============>..............] 
- ETA: 56s - loss: 1.5653 - regression_loss: 1.3494 - classification_loss: 0.2158 277/500 [===============>..............] - ETA: 56s - loss: 1.5642 - regression_loss: 1.3487 - classification_loss: 0.2155 278/500 [===============>..............] - ETA: 56s - loss: 1.5641 - regression_loss: 1.3485 - classification_loss: 0.2156 279/500 [===============>..............] - ETA: 55s - loss: 1.5638 - regression_loss: 1.3481 - classification_loss: 0.2157 280/500 [===============>..............] - ETA: 55s - loss: 1.5666 - regression_loss: 1.3507 - classification_loss: 0.2159 281/500 [===============>..............] - ETA: 55s - loss: 1.5688 - regression_loss: 1.3523 - classification_loss: 0.2164 282/500 [===============>..............] - ETA: 55s - loss: 1.5695 - regression_loss: 1.3531 - classification_loss: 0.2164 283/500 [===============>..............] - ETA: 54s - loss: 1.5670 - regression_loss: 1.3510 - classification_loss: 0.2161 284/500 [================>.............] - ETA: 54s - loss: 1.5682 - regression_loss: 1.3519 - classification_loss: 0.2164 285/500 [================>.............] - ETA: 54s - loss: 1.5682 - regression_loss: 1.3520 - classification_loss: 0.2162 286/500 [================>.............] - ETA: 54s - loss: 1.5683 - regression_loss: 1.3521 - classification_loss: 0.2162 287/500 [================>.............] - ETA: 53s - loss: 1.5701 - regression_loss: 1.3540 - classification_loss: 0.2162 288/500 [================>.............] - ETA: 53s - loss: 1.5714 - regression_loss: 1.3554 - classification_loss: 0.2160 289/500 [================>.............] - ETA: 53s - loss: 1.5717 - regression_loss: 1.3558 - classification_loss: 0.2159 290/500 [================>.............] - ETA: 53s - loss: 1.5711 - regression_loss: 1.3552 - classification_loss: 0.2159 291/500 [================>.............] - ETA: 52s - loss: 1.5726 - regression_loss: 1.3565 - classification_loss: 0.2161 292/500 [================>.............] 
- ETA: 52s - loss: 1.5731 - regression_loss: 1.3569 - classification_loss: 0.2162 293/500 [================>.............] - ETA: 52s - loss: 1.5736 - regression_loss: 1.3575 - classification_loss: 0.2162 294/500 [================>.............] - ETA: 52s - loss: 1.5748 - regression_loss: 1.3584 - classification_loss: 0.2163 295/500 [================>.............] - ETA: 51s - loss: 1.5755 - regression_loss: 1.3591 - classification_loss: 0.2164 296/500 [================>.............] - ETA: 51s - loss: 1.5756 - regression_loss: 1.3592 - classification_loss: 0.2164 297/500 [================>.............] - ETA: 51s - loss: 1.5743 - regression_loss: 1.3582 - classification_loss: 0.2162 298/500 [================>.............] - ETA: 51s - loss: 1.5756 - regression_loss: 1.3590 - classification_loss: 0.2165 299/500 [================>.............] - ETA: 50s - loss: 1.5768 - regression_loss: 1.3601 - classification_loss: 0.2166 300/500 [=================>............] - ETA: 50s - loss: 1.5767 - regression_loss: 1.3604 - classification_loss: 0.2163 301/500 [=================>............] - ETA: 50s - loss: 1.5746 - regression_loss: 1.3586 - classification_loss: 0.2160 302/500 [=================>............] - ETA: 50s - loss: 1.5751 - regression_loss: 1.3591 - classification_loss: 0.2159 303/500 [=================>............] - ETA: 49s - loss: 1.5758 - regression_loss: 1.3599 - classification_loss: 0.2159 304/500 [=================>............] - ETA: 49s - loss: 1.5742 - regression_loss: 1.3586 - classification_loss: 0.2156 305/500 [=================>............] - ETA: 49s - loss: 1.5741 - regression_loss: 1.3586 - classification_loss: 0.2155 306/500 [=================>............] - ETA: 49s - loss: 1.5751 - regression_loss: 1.3593 - classification_loss: 0.2158 307/500 [=================>............] - ETA: 48s - loss: 1.5760 - regression_loss: 1.3602 - classification_loss: 0.2158 308/500 [=================>............] 
- ETA: 48s - loss: 1.5761 - regression_loss: 1.3603 - classification_loss: 0.2158 309/500 [=================>............] - ETA: 48s - loss: 1.5773 - regression_loss: 1.3613 - classification_loss: 0.2160 310/500 [=================>............] - ETA: 48s - loss: 1.5743 - regression_loss: 1.3585 - classification_loss: 0.2158 311/500 [=================>............] - ETA: 47s - loss: 1.5764 - regression_loss: 1.3603 - classification_loss: 0.2161 312/500 [=================>............] - ETA: 47s - loss: 1.5765 - regression_loss: 1.3605 - classification_loss: 0.2161 313/500 [=================>............] - ETA: 47s - loss: 1.5764 - regression_loss: 1.3602 - classification_loss: 0.2162 314/500 [=================>............] - ETA: 47s - loss: 1.5768 - regression_loss: 1.3607 - classification_loss: 0.2161 315/500 [=================>............] - ETA: 46s - loss: 1.5765 - regression_loss: 1.3604 - classification_loss: 0.2161 316/500 [=================>............] - ETA: 46s - loss: 1.5750 - regression_loss: 1.3589 - classification_loss: 0.2160 317/500 [==================>...........] - ETA: 46s - loss: 1.5730 - regression_loss: 1.3575 - classification_loss: 0.2156 318/500 [==================>...........] - ETA: 46s - loss: 1.5730 - regression_loss: 1.3574 - classification_loss: 0.2156 319/500 [==================>...........] - ETA: 45s - loss: 1.5714 - regression_loss: 1.3559 - classification_loss: 0.2155 320/500 [==================>...........] - ETA: 45s - loss: 1.5722 - regression_loss: 1.3567 - classification_loss: 0.2156 321/500 [==================>...........] - ETA: 45s - loss: 1.5727 - regression_loss: 1.3569 - classification_loss: 0.2157 322/500 [==================>...........] - ETA: 45s - loss: 1.5739 - regression_loss: 1.3579 - classification_loss: 0.2159 323/500 [==================>...........] - ETA: 44s - loss: 1.5728 - regression_loss: 1.3571 - classification_loss: 0.2157 324/500 [==================>...........] 
- ETA: 44s - loss: 1.5694 - regression_loss: 1.3541 - classification_loss: 0.2153 325/500 [==================>...........] - ETA: 44s - loss: 1.5700 - regression_loss: 1.3546 - classification_loss: 0.2154 326/500 [==================>...........] - ETA: 44s - loss: 1.5709 - regression_loss: 1.3552 - classification_loss: 0.2158 327/500 [==================>...........] - ETA: 43s - loss: 1.5717 - regression_loss: 1.3557 - classification_loss: 0.2160 328/500 [==================>...........] - ETA: 43s - loss: 1.5692 - regression_loss: 1.3536 - classification_loss: 0.2156 329/500 [==================>...........] - ETA: 43s - loss: 1.5701 - regression_loss: 1.3544 - classification_loss: 0.2157 330/500 [==================>...........] - ETA: 43s - loss: 1.5709 - regression_loss: 1.3551 - classification_loss: 0.2158 331/500 [==================>...........] - ETA: 42s - loss: 1.5702 - regression_loss: 1.3547 - classification_loss: 0.2156 332/500 [==================>...........] - ETA: 42s - loss: 1.5715 - regression_loss: 1.3559 - classification_loss: 0.2155 333/500 [==================>...........] - ETA: 42s - loss: 1.5724 - regression_loss: 1.3567 - classification_loss: 0.2157 334/500 [===================>..........] - ETA: 42s - loss: 1.5741 - regression_loss: 1.3579 - classification_loss: 0.2161 335/500 [===================>..........] - ETA: 41s - loss: 1.5738 - regression_loss: 1.3577 - classification_loss: 0.2161 336/500 [===================>..........] - ETA: 41s - loss: 1.5749 - regression_loss: 1.3588 - classification_loss: 0.2161 337/500 [===================>..........] - ETA: 41s - loss: 1.5763 - regression_loss: 1.3597 - classification_loss: 0.2165 338/500 [===================>..........] - ETA: 40s - loss: 1.5779 - regression_loss: 1.3611 - classification_loss: 0.2168 339/500 [===================>..........] - ETA: 40s - loss: 1.5793 - regression_loss: 1.3625 - classification_loss: 0.2168 340/500 [===================>..........] 
- ETA: 40s - loss: 1.5792 - regression_loss: 1.3624 - classification_loss: 0.2168 341/500 [===================>..........] - ETA: 40s - loss: 1.5757 - regression_loss: 1.3593 - classification_loss: 0.2164 342/500 [===================>..........] - ETA: 39s - loss: 1.5736 - regression_loss: 1.3574 - classification_loss: 0.2161 343/500 [===================>..........] - ETA: 39s - loss: 1.5725 - regression_loss: 1.3567 - classification_loss: 0.2159 344/500 [===================>..........] - ETA: 39s - loss: 1.5723 - regression_loss: 1.3565 - classification_loss: 0.2158 345/500 [===================>..........] - ETA: 39s - loss: 1.5725 - regression_loss: 1.3568 - classification_loss: 0.2157 346/500 [===================>..........] - ETA: 38s - loss: 1.5721 - regression_loss: 1.3564 - classification_loss: 0.2157 347/500 [===================>..........] - ETA: 38s - loss: 1.5727 - regression_loss: 1.3568 - classification_loss: 0.2159 348/500 [===================>..........] - ETA: 38s - loss: 1.5725 - regression_loss: 1.3566 - classification_loss: 0.2159 349/500 [===================>..........] - ETA: 38s - loss: 1.5709 - regression_loss: 1.3554 - classification_loss: 0.2155 350/500 [====================>.........] - ETA: 37s - loss: 1.5681 - regression_loss: 1.3530 - classification_loss: 0.2151 351/500 [====================>.........] - ETA: 37s - loss: 1.5689 - regression_loss: 1.3537 - classification_loss: 0.2152 352/500 [====================>.........] - ETA: 37s - loss: 1.5676 - regression_loss: 1.3526 - classification_loss: 0.2150 353/500 [====================>.........] - ETA: 37s - loss: 1.5677 - regression_loss: 1.3527 - classification_loss: 0.2150 354/500 [====================>.........] - ETA: 36s - loss: 1.5689 - regression_loss: 1.3537 - classification_loss: 0.2152 355/500 [====================>.........] - ETA: 36s - loss: 1.5675 - regression_loss: 1.3526 - classification_loss: 0.2149 356/500 [====================>.........] 
- ETA: 36s - loss: 1.5677 - regression_loss: 1.3527 - classification_loss: 0.2150 357/500 [====================>.........] - ETA: 36s - loss: 1.5684 - regression_loss: 1.3532 - classification_loss: 0.2152 358/500 [====================>.........] - ETA: 35s - loss: 1.5685 - regression_loss: 1.3534 - classification_loss: 0.2151 359/500 [====================>.........] - ETA: 35s - loss: 1.5674 - regression_loss: 1.3525 - classification_loss: 0.2149 360/500 [====================>.........] - ETA: 35s - loss: 1.5683 - regression_loss: 1.3533 - classification_loss: 0.2150 361/500 [====================>.........] - ETA: 35s - loss: 1.5686 - regression_loss: 1.3535 - classification_loss: 0.2151 362/500 [====================>.........] - ETA: 34s - loss: 1.5678 - regression_loss: 1.3529 - classification_loss: 0.2150 363/500 [====================>.........] - ETA: 34s - loss: 1.5661 - regression_loss: 1.3515 - classification_loss: 0.2147 364/500 [====================>.........] - ETA: 34s - loss: 1.5671 - regression_loss: 1.3522 - classification_loss: 0.2149 365/500 [====================>.........] - ETA: 34s - loss: 1.5683 - regression_loss: 1.3530 - classification_loss: 0.2153 366/500 [====================>.........] - ETA: 33s - loss: 1.5674 - regression_loss: 1.3521 - classification_loss: 0.2153 367/500 [=====================>........] - ETA: 33s - loss: 1.5685 - regression_loss: 1.3529 - classification_loss: 0.2157 368/500 [=====================>........] - ETA: 33s - loss: 1.5689 - regression_loss: 1.3533 - classification_loss: 0.2156 369/500 [=====================>........] - ETA: 33s - loss: 1.5669 - regression_loss: 1.3516 - classification_loss: 0.2153 370/500 [=====================>........] - ETA: 32s - loss: 1.5676 - regression_loss: 1.3522 - classification_loss: 0.2154 371/500 [=====================>........] - ETA: 32s - loss: 1.5677 - regression_loss: 1.3524 - classification_loss: 0.2153 372/500 [=====================>........] 
- ETA: 32s - loss: 1.5650 - regression_loss: 1.3499 - classification_loss: 0.2151 373/500 [=====================>........] - ETA: 32s - loss: 1.5647 - regression_loss: 1.3495 - classification_loss: 0.2151 374/500 [=====================>........] - ETA: 31s - loss: 1.5624 - regression_loss: 1.3476 - classification_loss: 0.2148 375/500 [=====================>........] - ETA: 31s - loss: 1.5635 - regression_loss: 1.3486 - classification_loss: 0.2149 376/500 [=====================>........] - ETA: 31s - loss: 1.5640 - regression_loss: 1.3491 - classification_loss: 0.2149 377/500 [=====================>........] - ETA: 31s - loss: 1.5639 - regression_loss: 1.3491 - classification_loss: 0.2148 378/500 [=====================>........] - ETA: 30s - loss: 1.5648 - regression_loss: 1.3498 - classification_loss: 0.2150 379/500 [=====================>........] - ETA: 30s - loss: 1.5648 - regression_loss: 1.3497 - classification_loss: 0.2151 380/500 [=====================>........] - ETA: 30s - loss: 1.5638 - regression_loss: 1.3487 - classification_loss: 0.2151 381/500 [=====================>........] - ETA: 30s - loss: 1.5624 - regression_loss: 1.3475 - classification_loss: 0.2148 382/500 [=====================>........] - ETA: 29s - loss: 1.5628 - regression_loss: 1.3479 - classification_loss: 0.2149 383/500 [=====================>........] - ETA: 29s - loss: 1.5631 - regression_loss: 1.3482 - classification_loss: 0.2149 384/500 [======================>.......] - ETA: 29s - loss: 1.5629 - regression_loss: 1.3479 - classification_loss: 0.2150 385/500 [======================>.......] - ETA: 29s - loss: 1.5627 - regression_loss: 1.3478 - classification_loss: 0.2149 386/500 [======================>.......] - ETA: 28s - loss: 1.5625 - regression_loss: 1.3475 - classification_loss: 0.2150 387/500 [======================>.......] - ETA: 28s - loss: 1.5658 - regression_loss: 1.3500 - classification_loss: 0.2159 388/500 [======================>.......] 
- ETA: 28s - loss: 1.5660 - regression_loss: 1.3500 - classification_loss: 0.2160 389/500 [======================>.......] - ETA: 28s - loss: 1.5654 - regression_loss: 1.3495 - classification_loss: 0.2159 390/500 [======================>.......] - ETA: 27s - loss: 1.5671 - regression_loss: 1.3503 - classification_loss: 0.2168 391/500 [======================>.......] - ETA: 27s - loss: 1.5671 - regression_loss: 1.3502 - classification_loss: 0.2170 392/500 [======================>.......] - ETA: 27s - loss: 1.5655 - regression_loss: 1.3488 - classification_loss: 0.2167 393/500 [======================>.......] - ETA: 27s - loss: 1.5656 - regression_loss: 1.3488 - classification_loss: 0.2167 394/500 [======================>.......] - ETA: 26s - loss: 1.5666 - regression_loss: 1.3498 - classification_loss: 0.2168 395/500 [======================>.......] - ETA: 26s - loss: 1.5662 - regression_loss: 1.3495 - classification_loss: 0.2167 396/500 [======================>.......] - ETA: 26s - loss: 1.5670 - regression_loss: 1.3502 - classification_loss: 0.2168 397/500 [======================>.......] - ETA: 26s - loss: 1.5664 - regression_loss: 1.3498 - classification_loss: 0.2166 398/500 [======================>.......] - ETA: 25s - loss: 1.5661 - regression_loss: 1.3496 - classification_loss: 0.2165 399/500 [======================>.......] - ETA: 25s - loss: 1.5667 - regression_loss: 1.3502 - classification_loss: 0.2166 400/500 [=======================>......] - ETA: 25s - loss: 1.5673 - regression_loss: 1.3508 - classification_loss: 0.2165 401/500 [=======================>......] - ETA: 25s - loss: 1.5678 - regression_loss: 1.3513 - classification_loss: 0.2165 402/500 [=======================>......] - ETA: 24s - loss: 1.5664 - regression_loss: 1.3503 - classification_loss: 0.2162 403/500 [=======================>......] - ETA: 24s - loss: 1.5668 - regression_loss: 1.3507 - classification_loss: 0.2161 404/500 [=======================>......] 
- ETA: 24s - loss: 1.5670 - regression_loss: 1.3509 - classification_loss: 0.2161 405/500 [=======================>......] - ETA: 24s - loss: 1.5663 - regression_loss: 1.3503 - classification_loss: 0.2160 406/500 [=======================>......] - ETA: 23s - loss: 1.5650 - regression_loss: 1.3493 - classification_loss: 0.2158 407/500 [=======================>......] - ETA: 23s - loss: 1.5662 - regression_loss: 1.3503 - classification_loss: 0.2159 408/500 [=======================>......] - ETA: 23s - loss: 1.5647 - regression_loss: 1.3491 - classification_loss: 0.2156 409/500 [=======================>......] - ETA: 23s - loss: 1.5643 - regression_loss: 1.3488 - classification_loss: 0.2156 410/500 [=======================>......] - ETA: 22s - loss: 1.5632 - regression_loss: 1.3478 - classification_loss: 0.2154 411/500 [=======================>......] - ETA: 22s - loss: 1.5632 - regression_loss: 1.3479 - classification_loss: 0.2153 412/500 [=======================>......] - ETA: 22s - loss: 1.5631 - regression_loss: 1.3478 - classification_loss: 0.2153 413/500 [=======================>......] - ETA: 22s - loss: 1.5635 - regression_loss: 1.3482 - classification_loss: 0.2153 414/500 [=======================>......] - ETA: 21s - loss: 1.5626 - regression_loss: 1.3473 - classification_loss: 0.2153 415/500 [=======================>......] - ETA: 21s - loss: 1.5623 - regression_loss: 1.3471 - classification_loss: 0.2152 416/500 [=======================>......] - ETA: 21s - loss: 1.5636 - regression_loss: 1.3482 - classification_loss: 0.2154 417/500 [========================>.....] - ETA: 21s - loss: 1.5658 - regression_loss: 1.3500 - classification_loss: 0.2158 418/500 [========================>.....] - ETA: 20s - loss: 1.5656 - regression_loss: 1.3498 - classification_loss: 0.2158 419/500 [========================>.....] - ETA: 20s - loss: 1.5644 - regression_loss: 1.3487 - classification_loss: 0.2157 420/500 [========================>.....] 
- ETA: 20s - loss: 1.5645 - regression_loss: 1.3489 - classification_loss: 0.2155 421/500 [========================>.....] - ETA: 19s - loss: 1.5647 - regression_loss: 1.3492 - classification_loss: 0.2155 422/500 [========================>.....] - ETA: 19s - loss: 1.5648 - regression_loss: 1.3493 - classification_loss: 0.2155 423/500 [========================>.....] - ETA: 19s - loss: 1.5658 - regression_loss: 1.3502 - classification_loss: 0.2156 424/500 [========================>.....] - ETA: 19s - loss: 1.5669 - regression_loss: 1.3509 - classification_loss: 0.2160 425/500 [========================>.....] - ETA: 18s - loss: 1.5681 - regression_loss: 1.3519 - classification_loss: 0.2162 426/500 [========================>.....] - ETA: 18s - loss: 1.5681 - regression_loss: 1.3519 - classification_loss: 0.2162 427/500 [========================>.....] - ETA: 18s - loss: 1.5679 - regression_loss: 1.3520 - classification_loss: 0.2159 428/500 [========================>.....] - ETA: 18s - loss: 1.5688 - regression_loss: 1.3527 - classification_loss: 0.2161 429/500 [========================>.....] - ETA: 17s - loss: 1.5661 - regression_loss: 1.3503 - classification_loss: 0.2158 430/500 [========================>.....] - ETA: 17s - loss: 1.5643 - regression_loss: 1.3487 - classification_loss: 0.2157 431/500 [========================>.....] - ETA: 17s - loss: 1.5620 - regression_loss: 1.3466 - classification_loss: 0.2154 432/500 [========================>.....] - ETA: 17s - loss: 1.5620 - regression_loss: 1.3466 - classification_loss: 0.2154 433/500 [========================>.....] - ETA: 16s - loss: 1.5602 - regression_loss: 1.3452 - classification_loss: 0.2151 434/500 [=========================>....] - ETA: 16s - loss: 1.5611 - regression_loss: 1.3459 - classification_loss: 0.2151 435/500 [=========================>....] - ETA: 16s - loss: 1.5610 - regression_loss: 1.3460 - classification_loss: 0.2150 436/500 [=========================>....] 
- ETA: 16s - loss: 1.5614 - regression_loss: 1.3464 - classification_loss: 0.2150 437/500 [=========================>....] - ETA: 15s - loss: 1.5608 - regression_loss: 1.3459 - classification_loss: 0.2149 438/500 [=========================>....] - ETA: 15s - loss: 1.5606 - regression_loss: 1.3457 - classification_loss: 0.2149 439/500 [=========================>....] - ETA: 15s - loss: 1.5604 - regression_loss: 1.3454 - classification_loss: 0.2149 440/500 [=========================>....] - ETA: 15s - loss: 1.5609 - regression_loss: 1.3459 - classification_loss: 0.2150 441/500 [=========================>....] - ETA: 14s - loss: 1.5618 - regression_loss: 1.3468 - classification_loss: 0.2150 442/500 [=========================>....] - ETA: 14s - loss: 1.5622 - regression_loss: 1.3471 - classification_loss: 0.2151 443/500 [=========================>....] - ETA: 14s - loss: 1.5619 - regression_loss: 1.3469 - classification_loss: 0.2150 444/500 [=========================>....] - ETA: 14s - loss: 1.5614 - regression_loss: 1.3466 - classification_loss: 0.2149 445/500 [=========================>....] - ETA: 13s - loss: 1.5627 - regression_loss: 1.3476 - classification_loss: 0.2151 446/500 [=========================>....] - ETA: 13s - loss: 1.5631 - regression_loss: 1.3479 - classification_loss: 0.2152 447/500 [=========================>....] - ETA: 13s - loss: 1.5635 - regression_loss: 1.3482 - classification_loss: 0.2153 448/500 [=========================>....] - ETA: 13s - loss: 1.5644 - regression_loss: 1.3492 - classification_loss: 0.2152 449/500 [=========================>....] - ETA: 12s - loss: 1.5652 - regression_loss: 1.3499 - classification_loss: 0.2153 450/500 [==========================>...] - ETA: 12s - loss: 1.5653 - regression_loss: 1.3500 - classification_loss: 0.2153 451/500 [==========================>...] - ETA: 12s - loss: 1.5660 - regression_loss: 1.3507 - classification_loss: 0.2154 452/500 [==========================>...] 
- ETA: 12s - loss: 1.5673 - regression_loss: 1.3518 - classification_loss: 0.2155
[per-step progress updates for steps 453-498 of epoch 13 omitted; running loss fluctuated between about 1.55 and 1.57 (regression_loss ~1.34, classification_loss ~0.213) as the ETA fell from 11s to 0s]
499/500 [============================>.]
- ETA: 0s - loss: 1.5578 - regression_loss: 1.3446 - classification_loss: 0.2131
500/500 [==============================] - 126s 253ms/step - loss: 1.5573 - regression_loss: 1.3443 - classification_loss: 0.2130
1172 instances of class plum with average precision: 0.6944
mAP: 0.6944
Epoch 00013: saving model to ./training/snapshots/resnet50_pascal_13.h5
Epoch 14/150
[per-step progress updates for steps 1-13 of epoch 14 omitted; running loss climbed from 0.82 at step 1 to about 1.40 by step 13]
14/500 [..............................]
- ETA: 2:00 - loss: 1.4022 - regression_loss: 1.2029 - classification_loss: 0.1994
[per-step progress updates for steps 15-285 of epoch 14 omitted; running loss fluctuated between about 1.43 and 1.54 (regression_loss ~1.25-1.33, classification_loss ~0.20-0.21) while the ETA fell from about 1:58 to 54s]
286/500 [================>.............]
- ETA: 54s - loss: 1.5304 - regression_loss: 1.3255 - classification_loss: 0.2048 287/500 [================>.............] - ETA: 53s - loss: 1.5275 - regression_loss: 1.3230 - classification_loss: 0.2045 288/500 [================>.............] - ETA: 53s - loss: 1.5266 - regression_loss: 1.3220 - classification_loss: 0.2046 289/500 [================>.............] - ETA: 53s - loss: 1.5267 - regression_loss: 1.3219 - classification_loss: 0.2048 290/500 [================>.............] - ETA: 53s - loss: 1.5284 - regression_loss: 1.3233 - classification_loss: 0.2051 291/500 [================>.............] - ETA: 52s - loss: 1.5289 - regression_loss: 1.3238 - classification_loss: 0.2051 292/500 [================>.............] - ETA: 52s - loss: 1.5288 - regression_loss: 1.3237 - classification_loss: 0.2051 293/500 [================>.............] - ETA: 52s - loss: 1.5292 - regression_loss: 1.3240 - classification_loss: 0.2052 294/500 [================>.............] - ETA: 52s - loss: 1.5297 - regression_loss: 1.3245 - classification_loss: 0.2052 295/500 [================>.............] - ETA: 51s - loss: 1.5282 - regression_loss: 1.3232 - classification_loss: 0.2050 296/500 [================>.............] - ETA: 51s - loss: 1.5287 - regression_loss: 1.3237 - classification_loss: 0.2050 297/500 [================>.............] - ETA: 51s - loss: 1.5293 - regression_loss: 1.3242 - classification_loss: 0.2051 298/500 [================>.............] - ETA: 51s - loss: 1.5272 - regression_loss: 1.3224 - classification_loss: 0.2048 299/500 [================>.............] - ETA: 50s - loss: 1.5273 - regression_loss: 1.3226 - classification_loss: 0.2047 300/500 [=================>............] - ETA: 50s - loss: 1.5243 - regression_loss: 1.3200 - classification_loss: 0.2043 301/500 [=================>............] - ETA: 50s - loss: 1.5206 - regression_loss: 1.3168 - classification_loss: 0.2038 302/500 [=================>............] 
- ETA: 50s - loss: 1.5177 - regression_loss: 1.3140 - classification_loss: 0.2037 303/500 [=================>............] - ETA: 49s - loss: 1.5176 - regression_loss: 1.3139 - classification_loss: 0.2037 304/500 [=================>............] - ETA: 49s - loss: 1.5160 - regression_loss: 1.3126 - classification_loss: 0.2033 305/500 [=================>............] - ETA: 49s - loss: 1.5168 - regression_loss: 1.3137 - classification_loss: 0.2031 306/500 [=================>............] - ETA: 49s - loss: 1.5174 - regression_loss: 1.3143 - classification_loss: 0.2031 307/500 [=================>............] - ETA: 48s - loss: 1.5175 - regression_loss: 1.3144 - classification_loss: 0.2031 308/500 [=================>............] - ETA: 48s - loss: 1.5190 - regression_loss: 1.3158 - classification_loss: 0.2033 309/500 [=================>............] - ETA: 48s - loss: 1.5174 - regression_loss: 1.3143 - classification_loss: 0.2030 310/500 [=================>............] - ETA: 48s - loss: 1.5191 - regression_loss: 1.3159 - classification_loss: 0.2032 311/500 [=================>............] - ETA: 47s - loss: 1.5187 - regression_loss: 1.3155 - classification_loss: 0.2031 312/500 [=================>............] - ETA: 47s - loss: 1.5191 - regression_loss: 1.3162 - classification_loss: 0.2029 313/500 [=================>............] - ETA: 47s - loss: 1.5194 - regression_loss: 1.3165 - classification_loss: 0.2029 314/500 [=================>............] - ETA: 47s - loss: 1.5190 - regression_loss: 1.3162 - classification_loss: 0.2028 315/500 [=================>............] - ETA: 46s - loss: 1.5192 - regression_loss: 1.3164 - classification_loss: 0.2028 316/500 [=================>............] - ETA: 46s - loss: 1.5187 - regression_loss: 1.3159 - classification_loss: 0.2029 317/500 [==================>...........] - ETA: 46s - loss: 1.5183 - regression_loss: 1.3156 - classification_loss: 0.2027 318/500 [==================>...........] 
- ETA: 46s - loss: 1.5194 - regression_loss: 1.3163 - classification_loss: 0.2031 319/500 [==================>...........] - ETA: 45s - loss: 1.5186 - regression_loss: 1.3156 - classification_loss: 0.2030 320/500 [==================>...........] - ETA: 45s - loss: 1.5198 - regression_loss: 1.3167 - classification_loss: 0.2031 321/500 [==================>...........] - ETA: 45s - loss: 1.5198 - regression_loss: 1.3168 - classification_loss: 0.2030 322/500 [==================>...........] - ETA: 45s - loss: 1.5167 - regression_loss: 1.3141 - classification_loss: 0.2026 323/500 [==================>...........] - ETA: 44s - loss: 1.5171 - regression_loss: 1.3144 - classification_loss: 0.2027 324/500 [==================>...........] - ETA: 44s - loss: 1.5237 - regression_loss: 1.3196 - classification_loss: 0.2041 325/500 [==================>...........] - ETA: 44s - loss: 1.5216 - regression_loss: 1.3177 - classification_loss: 0.2039 326/500 [==================>...........] - ETA: 44s - loss: 1.5224 - regression_loss: 1.3184 - classification_loss: 0.2040 327/500 [==================>...........] - ETA: 43s - loss: 1.5199 - regression_loss: 1.3163 - classification_loss: 0.2036 328/500 [==================>...........] - ETA: 43s - loss: 1.5211 - regression_loss: 1.3171 - classification_loss: 0.2040 329/500 [==================>...........] - ETA: 43s - loss: 1.5206 - regression_loss: 1.3164 - classification_loss: 0.2041 330/500 [==================>...........] - ETA: 43s - loss: 1.5209 - regression_loss: 1.3168 - classification_loss: 0.2042 331/500 [==================>...........] - ETA: 42s - loss: 1.5196 - regression_loss: 1.3156 - classification_loss: 0.2039 332/500 [==================>...........] - ETA: 42s - loss: 1.5197 - regression_loss: 1.3157 - classification_loss: 0.2040 333/500 [==================>...........] - ETA: 42s - loss: 1.5209 - regression_loss: 1.3164 - classification_loss: 0.2045 334/500 [===================>..........] 
- ETA: 42s - loss: 1.5211 - regression_loss: 1.3167 - classification_loss: 0.2044 335/500 [===================>..........] - ETA: 41s - loss: 1.5186 - regression_loss: 1.3147 - classification_loss: 0.2040 336/500 [===================>..........] - ETA: 41s - loss: 1.5201 - regression_loss: 1.3159 - classification_loss: 0.2043 337/500 [===================>..........] - ETA: 41s - loss: 1.5221 - regression_loss: 1.3173 - classification_loss: 0.2048 338/500 [===================>..........] - ETA: 41s - loss: 1.5232 - regression_loss: 1.3183 - classification_loss: 0.2050 339/500 [===================>..........] - ETA: 40s - loss: 1.5247 - regression_loss: 1.3199 - classification_loss: 0.2048 340/500 [===================>..........] - ETA: 40s - loss: 1.5254 - regression_loss: 1.3204 - classification_loss: 0.2050 341/500 [===================>..........] - ETA: 40s - loss: 1.5246 - regression_loss: 1.3199 - classification_loss: 0.2047 342/500 [===================>..........] - ETA: 40s - loss: 1.5253 - regression_loss: 1.3205 - classification_loss: 0.2048 343/500 [===================>..........] - ETA: 39s - loss: 1.5220 - regression_loss: 1.3177 - classification_loss: 0.2043 344/500 [===================>..........] - ETA: 39s - loss: 1.5219 - regression_loss: 1.3176 - classification_loss: 0.2043 345/500 [===================>..........] - ETA: 39s - loss: 1.5203 - regression_loss: 1.3162 - classification_loss: 0.2041 346/500 [===================>..........] - ETA: 39s - loss: 1.5209 - regression_loss: 1.3166 - classification_loss: 0.2043 347/500 [===================>..........] - ETA: 38s - loss: 1.5218 - regression_loss: 1.3173 - classification_loss: 0.2045 348/500 [===================>..........] - ETA: 38s - loss: 1.5212 - regression_loss: 1.3169 - classification_loss: 0.2043 349/500 [===================>..........] - ETA: 38s - loss: 1.5201 - regression_loss: 1.3161 - classification_loss: 0.2040 350/500 [====================>.........] 
- ETA: 38s - loss: 1.5207 - regression_loss: 1.3166 - classification_loss: 0.2041 351/500 [====================>.........] - ETA: 37s - loss: 1.5183 - regression_loss: 1.3147 - classification_loss: 0.2037 352/500 [====================>.........] - ETA: 37s - loss: 1.5166 - regression_loss: 1.3132 - classification_loss: 0.2034 353/500 [====================>.........] - ETA: 37s - loss: 1.5166 - regression_loss: 1.3133 - classification_loss: 0.2033 354/500 [====================>.........] - ETA: 36s - loss: 1.5165 - regression_loss: 1.3133 - classification_loss: 0.2032 355/500 [====================>.........] - ETA: 36s - loss: 1.5145 - regression_loss: 1.3117 - classification_loss: 0.2028 356/500 [====================>.........] - ETA: 36s - loss: 1.5143 - regression_loss: 1.3116 - classification_loss: 0.2028 357/500 [====================>.........] - ETA: 36s - loss: 1.5139 - regression_loss: 1.3112 - classification_loss: 0.2027 358/500 [====================>.........] - ETA: 35s - loss: 1.5116 - regression_loss: 1.3093 - classification_loss: 0.2023 359/500 [====================>.........] - ETA: 35s - loss: 1.5124 - regression_loss: 1.3099 - classification_loss: 0.2024 360/500 [====================>.........] - ETA: 35s - loss: 1.5107 - regression_loss: 1.3086 - classification_loss: 0.2021 361/500 [====================>.........] - ETA: 35s - loss: 1.5129 - regression_loss: 1.3105 - classification_loss: 0.2024 362/500 [====================>.........] - ETA: 34s - loss: 1.5131 - regression_loss: 1.3107 - classification_loss: 0.2024 363/500 [====================>.........] - ETA: 34s - loss: 1.5128 - regression_loss: 1.3104 - classification_loss: 0.2024 364/500 [====================>.........] - ETA: 34s - loss: 1.5114 - regression_loss: 1.3092 - classification_loss: 0.2022 365/500 [====================>.........] - ETA: 34s - loss: 1.5123 - regression_loss: 1.3100 - classification_loss: 0.2024 366/500 [====================>.........] 
- ETA: 33s - loss: 1.5126 - regression_loss: 1.3101 - classification_loss: 0.2025 367/500 [=====================>........] - ETA: 33s - loss: 1.5133 - regression_loss: 1.3107 - classification_loss: 0.2026 368/500 [=====================>........] - ETA: 33s - loss: 1.5131 - regression_loss: 1.3105 - classification_loss: 0.2025 369/500 [=====================>........] - ETA: 33s - loss: 1.5128 - regression_loss: 1.3103 - classification_loss: 0.2025 370/500 [=====================>........] - ETA: 32s - loss: 1.5131 - regression_loss: 1.3107 - classification_loss: 0.2025 371/500 [=====================>........] - ETA: 32s - loss: 1.5128 - regression_loss: 1.3104 - classification_loss: 0.2023 372/500 [=====================>........] - ETA: 32s - loss: 1.5135 - regression_loss: 1.3109 - classification_loss: 0.2026 373/500 [=====================>........] - ETA: 32s - loss: 1.5144 - regression_loss: 1.3117 - classification_loss: 0.2027 374/500 [=====================>........] - ETA: 31s - loss: 1.5144 - regression_loss: 1.3118 - classification_loss: 0.2026 375/500 [=====================>........] - ETA: 31s - loss: 1.5151 - regression_loss: 1.3124 - classification_loss: 0.2028 376/500 [=====================>........] - ETA: 31s - loss: 1.5147 - regression_loss: 1.3119 - classification_loss: 0.2028 377/500 [=====================>........] - ETA: 31s - loss: 1.5149 - regression_loss: 1.3122 - classification_loss: 0.2028 378/500 [=====================>........] - ETA: 30s - loss: 1.5158 - regression_loss: 1.3130 - classification_loss: 0.2029 379/500 [=====================>........] - ETA: 30s - loss: 1.5162 - regression_loss: 1.3135 - classification_loss: 0.2027 380/500 [=====================>........] - ETA: 30s - loss: 1.5160 - regression_loss: 1.3134 - classification_loss: 0.2026 381/500 [=====================>........] - ETA: 30s - loss: 1.5145 - regression_loss: 1.3121 - classification_loss: 0.2024 382/500 [=====================>........] 
- ETA: 29s - loss: 1.5145 - regression_loss: 1.3121 - classification_loss: 0.2024 383/500 [=====================>........] - ETA: 29s - loss: 1.5130 - regression_loss: 1.3108 - classification_loss: 0.2021 384/500 [======================>.......] - ETA: 29s - loss: 1.5130 - regression_loss: 1.3109 - classification_loss: 0.2021 385/500 [======================>.......] - ETA: 29s - loss: 1.5131 - regression_loss: 1.3108 - classification_loss: 0.2023 386/500 [======================>.......] - ETA: 28s - loss: 1.5139 - regression_loss: 1.3115 - classification_loss: 0.2024 387/500 [======================>.......] - ETA: 28s - loss: 1.5146 - regression_loss: 1.3122 - classification_loss: 0.2024 388/500 [======================>.......] - ETA: 28s - loss: 1.5156 - regression_loss: 1.3131 - classification_loss: 0.2025 389/500 [======================>.......] - ETA: 28s - loss: 1.5164 - regression_loss: 1.3139 - classification_loss: 0.2025 390/500 [======================>.......] - ETA: 27s - loss: 1.5174 - regression_loss: 1.3148 - classification_loss: 0.2026 391/500 [======================>.......] - ETA: 27s - loss: 1.5169 - regression_loss: 1.3143 - classification_loss: 0.2026 392/500 [======================>.......] - ETA: 27s - loss: 1.5177 - regression_loss: 1.3151 - classification_loss: 0.2027 393/500 [======================>.......] - ETA: 27s - loss: 1.5170 - regression_loss: 1.3144 - classification_loss: 0.2026 394/500 [======================>.......] - ETA: 26s - loss: 1.5154 - regression_loss: 1.3131 - classification_loss: 0.2023 395/500 [======================>.......] - ETA: 26s - loss: 1.5189 - regression_loss: 1.3164 - classification_loss: 0.2025 396/500 [======================>.......] - ETA: 26s - loss: 1.5194 - regression_loss: 1.3168 - classification_loss: 0.2026 397/500 [======================>.......] - ETA: 26s - loss: 1.5186 - regression_loss: 1.3161 - classification_loss: 0.2025 398/500 [======================>.......] 
- ETA: 25s - loss: 1.5197 - regression_loss: 1.3170 - classification_loss: 0.2027 399/500 [======================>.......] - ETA: 25s - loss: 1.5177 - regression_loss: 1.3154 - classification_loss: 0.2023 400/500 [=======================>......] - ETA: 25s - loss: 1.5150 - regression_loss: 1.3131 - classification_loss: 0.2019 401/500 [=======================>......] - ETA: 25s - loss: 1.5138 - regression_loss: 1.3121 - classification_loss: 0.2017 402/500 [=======================>......] - ETA: 24s - loss: 1.5136 - regression_loss: 1.3120 - classification_loss: 0.2016 403/500 [=======================>......] - ETA: 24s - loss: 1.5139 - regression_loss: 1.3122 - classification_loss: 0.2017 404/500 [=======================>......] - ETA: 24s - loss: 1.5139 - regression_loss: 1.3122 - classification_loss: 0.2017 405/500 [=======================>......] - ETA: 24s - loss: 1.5137 - regression_loss: 1.3120 - classification_loss: 0.2017 406/500 [=======================>......] - ETA: 23s - loss: 1.5154 - regression_loss: 1.3134 - classification_loss: 0.2020 407/500 [=======================>......] - ETA: 23s - loss: 1.5153 - regression_loss: 1.3134 - classification_loss: 0.2019 408/500 [=======================>......] - ETA: 23s - loss: 1.5128 - regression_loss: 1.3112 - classification_loss: 0.2016 409/500 [=======================>......] - ETA: 23s - loss: 1.5138 - regression_loss: 1.3121 - classification_loss: 0.2017 410/500 [=======================>......] - ETA: 22s - loss: 1.5132 - regression_loss: 1.3115 - classification_loss: 0.2016 411/500 [=======================>......] - ETA: 22s - loss: 1.5130 - regression_loss: 1.3114 - classification_loss: 0.2016 412/500 [=======================>......] - ETA: 22s - loss: 1.5125 - regression_loss: 1.3110 - classification_loss: 0.2015 413/500 [=======================>......] - ETA: 22s - loss: 1.5110 - regression_loss: 1.3096 - classification_loss: 0.2014 414/500 [=======================>......] 
- ETA: 21s - loss: 1.5109 - regression_loss: 1.3095 - classification_loss: 0.2014 415/500 [=======================>......] - ETA: 21s - loss: 1.5100 - regression_loss: 1.3087 - classification_loss: 0.2013 416/500 [=======================>......] - ETA: 21s - loss: 1.5109 - regression_loss: 1.3096 - classification_loss: 0.2013 417/500 [========================>.....] - ETA: 21s - loss: 1.5113 - regression_loss: 1.3099 - classification_loss: 0.2014 418/500 [========================>.....] - ETA: 20s - loss: 1.5100 - regression_loss: 1.3087 - classification_loss: 0.2013 419/500 [========================>.....] - ETA: 20s - loss: 1.5107 - regression_loss: 1.3094 - classification_loss: 0.2014 420/500 [========================>.....] - ETA: 20s - loss: 1.5107 - regression_loss: 1.3094 - classification_loss: 0.2013 421/500 [========================>.....] - ETA: 20s - loss: 1.5107 - regression_loss: 1.3093 - classification_loss: 0.2014 422/500 [========================>.....] - ETA: 19s - loss: 1.5110 - regression_loss: 1.3096 - classification_loss: 0.2013 423/500 [========================>.....] - ETA: 19s - loss: 1.5088 - regression_loss: 1.3078 - classification_loss: 0.2010 424/500 [========================>.....] - ETA: 19s - loss: 1.5085 - regression_loss: 1.3076 - classification_loss: 0.2009 425/500 [========================>.....] - ETA: 18s - loss: 1.5070 - regression_loss: 1.3064 - classification_loss: 0.2006 426/500 [========================>.....] - ETA: 18s - loss: 1.5047 - regression_loss: 1.3044 - classification_loss: 0.2003 427/500 [========================>.....] - ETA: 18s - loss: 1.5036 - regression_loss: 1.3033 - classification_loss: 0.2003 428/500 [========================>.....] - ETA: 18s - loss: 1.5032 - regression_loss: 1.3029 - classification_loss: 0.2003 429/500 [========================>.....] - ETA: 17s - loss: 1.5009 - regression_loss: 1.3008 - classification_loss: 0.2001 430/500 [========================>.....] 
- ETA: 17s - loss: 1.5032 - regression_loss: 1.3027 - classification_loss: 0.2005 431/500 [========================>.....] - ETA: 17s - loss: 1.5017 - regression_loss: 1.3013 - classification_loss: 0.2004 432/500 [========================>.....] - ETA: 17s - loss: 1.5017 - regression_loss: 1.3012 - classification_loss: 0.2005 433/500 [========================>.....] - ETA: 16s - loss: 1.5019 - regression_loss: 1.3014 - classification_loss: 0.2004 434/500 [=========================>....] - ETA: 16s - loss: 1.5028 - regression_loss: 1.3022 - classification_loss: 0.2006 435/500 [=========================>....] - ETA: 16s - loss: 1.5033 - regression_loss: 1.3024 - classification_loss: 0.2009 436/500 [=========================>....] - ETA: 16s - loss: 1.5032 - regression_loss: 1.3024 - classification_loss: 0.2008 437/500 [=========================>....] - ETA: 15s - loss: 1.5034 - regression_loss: 1.3025 - classification_loss: 0.2009 438/500 [=========================>....] - ETA: 15s - loss: 1.5038 - regression_loss: 1.3030 - classification_loss: 0.2009 439/500 [=========================>....] - ETA: 15s - loss: 1.5029 - regression_loss: 1.3023 - classification_loss: 0.2006 440/500 [=========================>....] - ETA: 15s - loss: 1.5022 - regression_loss: 1.3017 - classification_loss: 0.2004 441/500 [=========================>....] - ETA: 14s - loss: 1.5027 - regression_loss: 1.3022 - classification_loss: 0.2005 442/500 [=========================>....] - ETA: 14s - loss: 1.5023 - regression_loss: 1.3018 - classification_loss: 0.2005 443/500 [=========================>....] - ETA: 14s - loss: 1.5020 - regression_loss: 1.3016 - classification_loss: 0.2004 444/500 [=========================>....] - ETA: 14s - loss: 1.5020 - regression_loss: 1.3015 - classification_loss: 0.2004 445/500 [=========================>....] - ETA: 13s - loss: 1.5014 - regression_loss: 1.3011 - classification_loss: 0.2004 446/500 [=========================>....] 
- ETA: 13s - loss: 1.5021 - regression_loss: 1.3018 - classification_loss: 0.2003 447/500 [=========================>....] - ETA: 13s - loss: 1.5023 - regression_loss: 1.3020 - classification_loss: 0.2002 448/500 [=========================>....] - ETA: 13s - loss: 1.5028 - regression_loss: 1.3024 - classification_loss: 0.2003 449/500 [=========================>....] - ETA: 12s - loss: 1.5025 - regression_loss: 1.3022 - classification_loss: 0.2003 450/500 [==========================>...] - ETA: 12s - loss: 1.5031 - regression_loss: 1.3027 - classification_loss: 0.2004 451/500 [==========================>...] - ETA: 12s - loss: 1.5018 - regression_loss: 1.3016 - classification_loss: 0.2002 452/500 [==========================>...] - ETA: 12s - loss: 1.5009 - regression_loss: 1.3007 - classification_loss: 0.2002 453/500 [==========================>...] - ETA: 11s - loss: 1.5013 - regression_loss: 1.3011 - classification_loss: 0.2003 454/500 [==========================>...] - ETA: 11s - loss: 1.5012 - regression_loss: 1.3010 - classification_loss: 0.2002 455/500 [==========================>...] - ETA: 11s - loss: 1.5017 - regression_loss: 1.3014 - classification_loss: 0.2003 456/500 [==========================>...] - ETA: 11s - loss: 1.5005 - regression_loss: 1.3004 - classification_loss: 0.2001 457/500 [==========================>...] - ETA: 10s - loss: 1.5004 - regression_loss: 1.3004 - classification_loss: 0.2000 458/500 [==========================>...] - ETA: 10s - loss: 1.4995 - regression_loss: 1.2995 - classification_loss: 0.2000 459/500 [==========================>...] - ETA: 10s - loss: 1.5000 - regression_loss: 1.2999 - classification_loss: 0.2000 460/500 [==========================>...] - ETA: 10s - loss: 1.4995 - regression_loss: 1.2994 - classification_loss: 0.2001 461/500 [==========================>...] - ETA: 9s - loss: 1.4983 - regression_loss: 1.2985 - classification_loss: 0.1998  462/500 [==========================>...] 
- ETA: 9s - loss: 1.4993 - regression_loss: 1.2995 - classification_loss: 0.1998 463/500 [==========================>...] - ETA: 9s - loss: 1.4980 - regression_loss: 1.2984 - classification_loss: 0.1996 464/500 [==========================>...] - ETA: 9s - loss: 1.4965 - regression_loss: 1.2971 - classification_loss: 0.1995 465/500 [==========================>...] - ETA: 8s - loss: 1.4969 - regression_loss: 1.2974 - classification_loss: 0.1995 466/500 [==========================>...] - ETA: 8s - loss: 1.4964 - regression_loss: 1.2970 - classification_loss: 0.1994 467/500 [===========================>..] - ETA: 8s - loss: 1.4958 - regression_loss: 1.2964 - classification_loss: 0.1994 468/500 [===========================>..] - ETA: 8s - loss: 1.4952 - regression_loss: 1.2959 - classification_loss: 0.1993 469/500 [===========================>..] - ETA: 7s - loss: 1.4950 - regression_loss: 1.2957 - classification_loss: 0.1992 470/500 [===========================>..] - ETA: 7s - loss: 1.4946 - regression_loss: 1.2954 - classification_loss: 0.1992 471/500 [===========================>..] - ETA: 7s - loss: 1.4950 - regression_loss: 1.2956 - classification_loss: 0.1994 472/500 [===========================>..] - ETA: 7s - loss: 1.4949 - regression_loss: 1.2955 - classification_loss: 0.1994 473/500 [===========================>..] - ETA: 6s - loss: 1.4963 - regression_loss: 1.2967 - classification_loss: 0.1997 474/500 [===========================>..] - ETA: 6s - loss: 1.4979 - regression_loss: 1.2981 - classification_loss: 0.1999 475/500 [===========================>..] - ETA: 6s - loss: 1.4997 - regression_loss: 1.2998 - classification_loss: 0.2000 476/500 [===========================>..] - ETA: 6s - loss: 1.5009 - regression_loss: 1.3006 - classification_loss: 0.2002 477/500 [===========================>..] - ETA: 5s - loss: 1.4992 - regression_loss: 1.2992 - classification_loss: 0.2000 478/500 [===========================>..] 
- ETA: 5s - loss: 1.4991 - regression_loss: 1.2989 - classification_loss: 0.2001 479/500 [===========================>..] - ETA: 5s - loss: 1.5003 - regression_loss: 1.3001 - classification_loss: 0.2001 480/500 [===========================>..] - ETA: 5s - loss: 1.5012 - regression_loss: 1.3009 - classification_loss: 0.2003 481/500 [===========================>..] - ETA: 4s - loss: 1.5005 - regression_loss: 1.3004 - classification_loss: 0.2002 482/500 [===========================>..] - ETA: 4s - loss: 1.5009 - regression_loss: 1.3007 - classification_loss: 0.2002 483/500 [===========================>..] - ETA: 4s - loss: 1.5020 - regression_loss: 1.3017 - classification_loss: 0.2003 484/500 [============================>.] - ETA: 4s - loss: 1.5011 - regression_loss: 1.3009 - classification_loss: 0.2002 485/500 [============================>.] - ETA: 3s - loss: 1.5020 - regression_loss: 1.3016 - classification_loss: 0.2004 486/500 [============================>.] - ETA: 3s - loss: 1.5018 - regression_loss: 1.3015 - classification_loss: 0.2003 487/500 [============================>.] - ETA: 3s - loss: 1.5022 - regression_loss: 1.3018 - classification_loss: 0.2004 488/500 [============================>.] - ETA: 3s - loss: 1.5019 - regression_loss: 1.3015 - classification_loss: 0.2003 489/500 [============================>.] - ETA: 2s - loss: 1.5007 - regression_loss: 1.3006 - classification_loss: 0.2001 490/500 [============================>.] - ETA: 2s - loss: 1.5004 - regression_loss: 1.3003 - classification_loss: 0.2000 491/500 [============================>.] - ETA: 2s - loss: 1.4993 - regression_loss: 1.2996 - classification_loss: 0.1998 492/500 [============================>.] - ETA: 2s - loss: 1.4997 - regression_loss: 1.2999 - classification_loss: 0.1999 493/500 [============================>.] - ETA: 1s - loss: 1.5001 - regression_loss: 1.3002 - classification_loss: 0.1999 494/500 [============================>.] 
- ETA: 1s - loss: 1.4982 - regression_loss: 1.2986 - classification_loss: 0.1996 495/500 [============================>.] - ETA: 1s - loss: 1.4985 - regression_loss: 1.2988 - classification_loss: 0.1997 496/500 [============================>.] - ETA: 1s - loss: 1.4986 - regression_loss: 1.2989 - classification_loss: 0.1997 497/500 [============================>.] - ETA: 0s - loss: 1.4988 - regression_loss: 1.2991 - classification_loss: 0.1997 498/500 [============================>.] - ETA: 0s - loss: 1.4987 - regression_loss: 1.2990 - classification_loss: 0.1997 499/500 [============================>.] - ETA: 0s - loss: 1.4988 - regression_loss: 1.2991 - classification_loss: 0.1997 500/500 [==============================] - 127s 254ms/step - loss: 1.4984 - regression_loss: 1.2988 - classification_loss: 0.1996 1172 instances of class plum with average precision: 0.6966 mAP: 0.6966 Epoch 00014: saving model to ./training/snapshots/resnet50_pascal_14.h5 Epoch 15/150 1/500 [..............................] - ETA: 1:51 - loss: 1.0854 - regression_loss: 0.9801 - classification_loss: 0.1053 2/500 [..............................] - ETA: 1:58 - loss: 1.2728 - regression_loss: 1.0963 - classification_loss: 0.1765 3/500 [..............................] - ETA: 2:04 - loss: 1.4401 - regression_loss: 1.2375 - classification_loss: 0.2026 4/500 [..............................] - ETA: 2:04 - loss: 1.3455 - regression_loss: 1.1699 - classification_loss: 0.1755 5/500 [..............................] - ETA: 2:03 - loss: 1.4052 - regression_loss: 1.2229 - classification_loss: 0.1823 6/500 [..............................] - ETA: 2:03 - loss: 1.4254 - regression_loss: 1.2414 - classification_loss: 0.1840 7/500 [..............................] - ETA: 2:03 - loss: 1.4086 - regression_loss: 1.2294 - classification_loss: 0.1791 8/500 [..............................] - ETA: 2:03 - loss: 1.4865 - regression_loss: 1.2947 - classification_loss: 0.1918 9/500 [..............................] 
[Epoch 15/150: per-batch progress-bar updates for batches 1-57 condensed; loss fluctuating in the ~1.48-1.61 range, ETA ~2:00 per 500-batch epoch]
- ETA: 1:50 - loss: 1.5519 - regression_loss: 1.3445 - classification_loss: 0.2074 58/500 [==>...........................] - ETA: 1:50 - loss: 1.5395 - regression_loss: 1.3338 - classification_loss: 0.2057 59/500 [==>...........................] - ETA: 1:50 - loss: 1.5510 - regression_loss: 1.3439 - classification_loss: 0.2071 60/500 [==>...........................] - ETA: 1:50 - loss: 1.5513 - regression_loss: 1.3441 - classification_loss: 0.2072 61/500 [==>...........................] - ETA: 1:49 - loss: 1.5502 - regression_loss: 1.3439 - classification_loss: 0.2063 62/500 [==>...........................] - ETA: 1:49 - loss: 1.5356 - regression_loss: 1.3222 - classification_loss: 0.2134 63/500 [==>...........................] - ETA: 1:49 - loss: 1.5414 - regression_loss: 1.3268 - classification_loss: 0.2146 64/500 [==>...........................] - ETA: 1:49 - loss: 1.5427 - regression_loss: 1.3282 - classification_loss: 0.2145 65/500 [==>...........................] - ETA: 1:49 - loss: 1.5445 - regression_loss: 1.3304 - classification_loss: 0.2142 66/500 [==>...........................] - ETA: 1:48 - loss: 1.5349 - regression_loss: 1.3213 - classification_loss: 0.2136 67/500 [===>..........................] - ETA: 1:48 - loss: 1.5352 - regression_loss: 1.3217 - classification_loss: 0.2135 68/500 [===>..........................] - ETA: 1:48 - loss: 1.5337 - regression_loss: 1.3204 - classification_loss: 0.2133 69/500 [===>..........................] - ETA: 1:48 - loss: 1.5324 - regression_loss: 1.3202 - classification_loss: 0.2122 70/500 [===>..........................] - ETA: 1:48 - loss: 1.5286 - regression_loss: 1.3175 - classification_loss: 0.2111 71/500 [===>..........................] - ETA: 1:47 - loss: 1.5248 - regression_loss: 1.3150 - classification_loss: 0.2098 72/500 [===>..........................] - ETA: 1:47 - loss: 1.5256 - regression_loss: 1.3155 - classification_loss: 0.2100 73/500 [===>..........................] 
- ETA: 1:47 - loss: 1.5150 - regression_loss: 1.3071 - classification_loss: 0.2078 74/500 [===>..........................] - ETA: 1:47 - loss: 1.5195 - regression_loss: 1.3099 - classification_loss: 0.2097 75/500 [===>..........................] - ETA: 1:47 - loss: 1.5206 - regression_loss: 1.3109 - classification_loss: 0.2097 76/500 [===>..........................] - ETA: 1:46 - loss: 1.5229 - regression_loss: 1.3128 - classification_loss: 0.2101 77/500 [===>..........................] - ETA: 1:46 - loss: 1.5214 - regression_loss: 1.3116 - classification_loss: 0.2098 78/500 [===>..........................] - ETA: 1:46 - loss: 1.5220 - regression_loss: 1.3124 - classification_loss: 0.2095 79/500 [===>..........................] - ETA: 1:46 - loss: 1.5274 - regression_loss: 1.3161 - classification_loss: 0.2113 80/500 [===>..........................] - ETA: 1:45 - loss: 1.5295 - regression_loss: 1.3169 - classification_loss: 0.2126 81/500 [===>..........................] - ETA: 1:45 - loss: 1.5208 - regression_loss: 1.3095 - classification_loss: 0.2113 82/500 [===>..........................] - ETA: 1:45 - loss: 1.5261 - regression_loss: 1.3141 - classification_loss: 0.2120 83/500 [===>..........................] - ETA: 1:45 - loss: 1.5271 - regression_loss: 1.3163 - classification_loss: 0.2108 84/500 [====>.........................] - ETA: 1:44 - loss: 1.5240 - regression_loss: 1.3138 - classification_loss: 0.2102 85/500 [====>.........................] - ETA: 1:44 - loss: 1.5148 - regression_loss: 1.3059 - classification_loss: 0.2089 86/500 [====>.........................] - ETA: 1:44 - loss: 1.5191 - regression_loss: 1.3100 - classification_loss: 0.2091 87/500 [====>.........................] - ETA: 1:44 - loss: 1.5139 - regression_loss: 1.3062 - classification_loss: 0.2077 88/500 [====>.........................] - ETA: 1:43 - loss: 1.5141 - regression_loss: 1.3068 - classification_loss: 0.2073 89/500 [====>.........................] 
- ETA: 1:43 - loss: 1.5224 - regression_loss: 1.3134 - classification_loss: 0.2090 90/500 [====>.........................] - ETA: 1:43 - loss: 1.5228 - regression_loss: 1.3139 - classification_loss: 0.2089 91/500 [====>.........................] - ETA: 1:43 - loss: 1.5221 - regression_loss: 1.3124 - classification_loss: 0.2097 92/500 [====>.........................] - ETA: 1:43 - loss: 1.5244 - regression_loss: 1.3140 - classification_loss: 0.2104 93/500 [====>.........................] - ETA: 1:42 - loss: 1.5166 - regression_loss: 1.3077 - classification_loss: 0.2090 94/500 [====>.........................] - ETA: 1:42 - loss: 1.5211 - regression_loss: 1.3117 - classification_loss: 0.2094 95/500 [====>.........................] - ETA: 1:42 - loss: 1.5166 - regression_loss: 1.3078 - classification_loss: 0.2088 96/500 [====>.........................] - ETA: 1:42 - loss: 1.5180 - regression_loss: 1.3090 - classification_loss: 0.2089 97/500 [====>.........................] - ETA: 1:41 - loss: 1.5177 - regression_loss: 1.3091 - classification_loss: 0.2085 98/500 [====>.........................] - ETA: 1:41 - loss: 1.5188 - regression_loss: 1.3099 - classification_loss: 0.2089 99/500 [====>.........................] - ETA: 1:41 - loss: 1.5228 - regression_loss: 1.3131 - classification_loss: 0.2097 100/500 [=====>........................] - ETA: 1:41 - loss: 1.5264 - regression_loss: 1.3156 - classification_loss: 0.2108 101/500 [=====>........................] - ETA: 1:40 - loss: 1.5222 - regression_loss: 1.3115 - classification_loss: 0.2107 102/500 [=====>........................] - ETA: 1:40 - loss: 1.5219 - regression_loss: 1.3109 - classification_loss: 0.2110 103/500 [=====>........................] - ETA: 1:40 - loss: 1.5212 - regression_loss: 1.3097 - classification_loss: 0.2115 104/500 [=====>........................] - ETA: 1:40 - loss: 1.5152 - regression_loss: 1.3046 - classification_loss: 0.2105 105/500 [=====>........................] 
- ETA: 1:39 - loss: 1.5145 - regression_loss: 1.3045 - classification_loss: 0.2100 106/500 [=====>........................] - ETA: 1:39 - loss: 1.5149 - regression_loss: 1.3047 - classification_loss: 0.2101 107/500 [=====>........................] - ETA: 1:39 - loss: 1.5217 - regression_loss: 1.3109 - classification_loss: 0.2108 108/500 [=====>........................] - ETA: 1:39 - loss: 1.5219 - regression_loss: 1.3113 - classification_loss: 0.2106 109/500 [=====>........................] - ETA: 1:38 - loss: 1.5187 - regression_loss: 1.3085 - classification_loss: 0.2102 110/500 [=====>........................] - ETA: 1:38 - loss: 1.5232 - regression_loss: 1.3123 - classification_loss: 0.2109 111/500 [=====>........................] - ETA: 1:38 - loss: 1.5268 - regression_loss: 1.3151 - classification_loss: 0.2116 112/500 [=====>........................] - ETA: 1:38 - loss: 1.5264 - regression_loss: 1.3148 - classification_loss: 0.2116 113/500 [=====>........................] - ETA: 1:37 - loss: 1.5268 - regression_loss: 1.3150 - classification_loss: 0.2118 114/500 [=====>........................] - ETA: 1:37 - loss: 1.5265 - regression_loss: 1.3150 - classification_loss: 0.2116 115/500 [=====>........................] - ETA: 1:37 - loss: 1.5298 - regression_loss: 1.3179 - classification_loss: 0.2119 116/500 [=====>........................] - ETA: 1:37 - loss: 1.5284 - regression_loss: 1.3169 - classification_loss: 0.2115 117/500 [======>.......................] - ETA: 1:37 - loss: 1.5269 - regression_loss: 1.3160 - classification_loss: 0.2110 118/500 [======>.......................] - ETA: 1:36 - loss: 1.5251 - regression_loss: 1.3144 - classification_loss: 0.2107 119/500 [======>.......................] - ETA: 1:36 - loss: 1.5219 - regression_loss: 1.3116 - classification_loss: 0.2103 120/500 [======>.......................] - ETA: 1:36 - loss: 1.5234 - regression_loss: 1.3130 - classification_loss: 0.2104 121/500 [======>.......................] 
- ETA: 1:36 - loss: 1.5207 - regression_loss: 1.3110 - classification_loss: 0.2097 122/500 [======>.......................] - ETA: 1:35 - loss: 1.5221 - regression_loss: 1.3124 - classification_loss: 0.2097 123/500 [======>.......................] - ETA: 1:35 - loss: 1.5205 - regression_loss: 1.3102 - classification_loss: 0.2103 124/500 [======>.......................] - ETA: 1:35 - loss: 1.5208 - regression_loss: 1.3105 - classification_loss: 0.2103 125/500 [======>.......................] - ETA: 1:35 - loss: 1.5180 - regression_loss: 1.3081 - classification_loss: 0.2098 126/500 [======>.......................] - ETA: 1:34 - loss: 1.5203 - regression_loss: 1.3108 - classification_loss: 0.2095 127/500 [======>.......................] - ETA: 1:34 - loss: 1.5207 - regression_loss: 1.3112 - classification_loss: 0.2095 128/500 [======>.......................] - ETA: 1:34 - loss: 1.5194 - regression_loss: 1.3107 - classification_loss: 0.2087 129/500 [======>.......................] - ETA: 1:34 - loss: 1.5184 - regression_loss: 1.3100 - classification_loss: 0.2083 130/500 [======>.......................] - ETA: 1:33 - loss: 1.5228 - regression_loss: 1.3136 - classification_loss: 0.2092 131/500 [======>.......................] - ETA: 1:33 - loss: 1.5176 - regression_loss: 1.3095 - classification_loss: 0.2081 132/500 [======>.......................] - ETA: 1:33 - loss: 1.5188 - regression_loss: 1.3107 - classification_loss: 0.2081 133/500 [======>.......................] - ETA: 1:33 - loss: 1.5174 - regression_loss: 1.3097 - classification_loss: 0.2077 134/500 [=======>......................] - ETA: 1:32 - loss: 1.5132 - regression_loss: 1.3063 - classification_loss: 0.2069 135/500 [=======>......................] - ETA: 1:32 - loss: 1.5077 - regression_loss: 1.3017 - classification_loss: 0.2060 136/500 [=======>......................] - ETA: 1:32 - loss: 1.5082 - regression_loss: 1.3020 - classification_loss: 0.2063 137/500 [=======>......................] 
- ETA: 1:32 - loss: 1.5100 - regression_loss: 1.3034 - classification_loss: 0.2065 138/500 [=======>......................] - ETA: 1:31 - loss: 1.5136 - regression_loss: 1.3063 - classification_loss: 0.2073 139/500 [=======>......................] - ETA: 1:31 - loss: 1.5122 - regression_loss: 1.3054 - classification_loss: 0.2068 140/500 [=======>......................] - ETA: 1:31 - loss: 1.5095 - regression_loss: 1.3033 - classification_loss: 0.2062 141/500 [=======>......................] - ETA: 1:31 - loss: 1.5124 - regression_loss: 1.3062 - classification_loss: 0.2062 142/500 [=======>......................] - ETA: 1:30 - loss: 1.5094 - regression_loss: 1.3037 - classification_loss: 0.2057 143/500 [=======>......................] - ETA: 1:30 - loss: 1.5064 - regression_loss: 1.3010 - classification_loss: 0.2053 144/500 [=======>......................] - ETA: 1:30 - loss: 1.5037 - regression_loss: 1.2990 - classification_loss: 0.2047 145/500 [=======>......................] - ETA: 1:30 - loss: 1.5049 - regression_loss: 1.3001 - classification_loss: 0.2048 146/500 [=======>......................] - ETA: 1:29 - loss: 1.5054 - regression_loss: 1.2997 - classification_loss: 0.2056 147/500 [=======>......................] - ETA: 1:29 - loss: 1.5051 - regression_loss: 1.2990 - classification_loss: 0.2062 148/500 [=======>......................] - ETA: 1:29 - loss: 1.5057 - regression_loss: 1.2993 - classification_loss: 0.2064 149/500 [=======>......................] - ETA: 1:29 - loss: 1.5047 - regression_loss: 1.2986 - classification_loss: 0.2061 150/500 [========>.....................] - ETA: 1:28 - loss: 1.5011 - regression_loss: 1.2955 - classification_loss: 0.2056 151/500 [========>.....................] - ETA: 1:28 - loss: 1.5023 - regression_loss: 1.2964 - classification_loss: 0.2058 152/500 [========>.....................] - ETA: 1:28 - loss: 1.5034 - regression_loss: 1.2975 - classification_loss: 0.2059 153/500 [========>.....................] 
- ETA: 1:28 - loss: 1.5027 - regression_loss: 1.2963 - classification_loss: 0.2064 154/500 [========>.....................] - ETA: 1:27 - loss: 1.5031 - regression_loss: 1.2967 - classification_loss: 0.2064 155/500 [========>.....................] - ETA: 1:27 - loss: 1.5045 - regression_loss: 1.2977 - classification_loss: 0.2068 156/500 [========>.....................] - ETA: 1:27 - loss: 1.5030 - regression_loss: 1.2965 - classification_loss: 0.2065 157/500 [========>.....................] - ETA: 1:27 - loss: 1.5034 - regression_loss: 1.2973 - classification_loss: 0.2061 158/500 [========>.....................] - ETA: 1:26 - loss: 1.5033 - regression_loss: 1.2973 - classification_loss: 0.2059 159/500 [========>.....................] - ETA: 1:26 - loss: 1.5050 - regression_loss: 1.2988 - classification_loss: 0.2062 160/500 [========>.....................] - ETA: 1:26 - loss: 1.5042 - regression_loss: 1.2982 - classification_loss: 0.2060 161/500 [========>.....................] - ETA: 1:26 - loss: 1.5004 - regression_loss: 1.2951 - classification_loss: 0.2053 162/500 [========>.....................] - ETA: 1:26 - loss: 1.5066 - regression_loss: 1.3001 - classification_loss: 0.2065 163/500 [========>.....................] - ETA: 1:25 - loss: 1.5085 - regression_loss: 1.3017 - classification_loss: 0.2068 164/500 [========>.....................] - ETA: 1:25 - loss: 1.5059 - regression_loss: 1.2997 - classification_loss: 0.2062 165/500 [========>.....................] - ETA: 1:25 - loss: 1.5087 - regression_loss: 1.3021 - classification_loss: 0.2066 166/500 [========>.....................] - ETA: 1:25 - loss: 1.5058 - regression_loss: 1.2996 - classification_loss: 0.2062 167/500 [=========>....................] - ETA: 1:24 - loss: 1.5083 - regression_loss: 1.3014 - classification_loss: 0.2070 168/500 [=========>....................] - ETA: 1:24 - loss: 1.5081 - regression_loss: 1.3013 - classification_loss: 0.2068 169/500 [=========>....................] 
- ETA: 1:24 - loss: 1.5039 - regression_loss: 1.2979 - classification_loss: 0.2061 170/500 [=========>....................] - ETA: 1:23 - loss: 1.5010 - regression_loss: 1.2956 - classification_loss: 0.2054 171/500 [=========>....................] - ETA: 1:23 - loss: 1.4998 - regression_loss: 1.2938 - classification_loss: 0.2060 172/500 [=========>....................] - ETA: 1:23 - loss: 1.5006 - regression_loss: 1.2942 - classification_loss: 0.2065 173/500 [=========>....................] - ETA: 1:23 - loss: 1.5004 - regression_loss: 1.2943 - classification_loss: 0.2061 174/500 [=========>....................] - ETA: 1:22 - loss: 1.5015 - regression_loss: 1.2954 - classification_loss: 0.2061 175/500 [=========>....................] - ETA: 1:22 - loss: 1.5021 - regression_loss: 1.2961 - classification_loss: 0.2060 176/500 [=========>....................] - ETA: 1:22 - loss: 1.5025 - regression_loss: 1.2966 - classification_loss: 0.2059 177/500 [=========>....................] - ETA: 1:22 - loss: 1.4992 - regression_loss: 1.2935 - classification_loss: 0.2057 178/500 [=========>....................] - ETA: 1:21 - loss: 1.5016 - regression_loss: 1.2955 - classification_loss: 0.2061 179/500 [=========>....................] - ETA: 1:21 - loss: 1.4997 - regression_loss: 1.2936 - classification_loss: 0.2061 180/500 [=========>....................] - ETA: 1:21 - loss: 1.4957 - regression_loss: 1.2904 - classification_loss: 0.2053 181/500 [=========>....................] - ETA: 1:21 - loss: 1.4952 - regression_loss: 1.2901 - classification_loss: 0.2051 182/500 [=========>....................] - ETA: 1:20 - loss: 1.4940 - regression_loss: 1.2890 - classification_loss: 0.2050 183/500 [=========>....................] - ETA: 1:20 - loss: 1.4959 - regression_loss: 1.2909 - classification_loss: 0.2050 184/500 [==========>...................] - ETA: 1:20 - loss: 1.4965 - regression_loss: 1.2912 - classification_loss: 0.2053 185/500 [==========>...................] 
- ETA: 1:20 - loss: 1.4971 - regression_loss: 1.2919 - classification_loss: 0.2051 186/500 [==========>...................] - ETA: 1:19 - loss: 1.4936 - regression_loss: 1.2889 - classification_loss: 0.2046 187/500 [==========>...................] - ETA: 1:19 - loss: 1.4951 - regression_loss: 1.2899 - classification_loss: 0.2052 188/500 [==========>...................] - ETA: 1:19 - loss: 1.4975 - regression_loss: 1.2918 - classification_loss: 0.2056 189/500 [==========>...................] - ETA: 1:19 - loss: 1.4965 - regression_loss: 1.2913 - classification_loss: 0.2052 190/500 [==========>...................] - ETA: 1:18 - loss: 1.4987 - regression_loss: 1.2929 - classification_loss: 0.2058 191/500 [==========>...................] - ETA: 1:18 - loss: 1.4952 - regression_loss: 1.2901 - classification_loss: 0.2051 192/500 [==========>...................] - ETA: 1:18 - loss: 1.4957 - regression_loss: 1.2906 - classification_loss: 0.2051 193/500 [==========>...................] - ETA: 1:18 - loss: 1.4962 - regression_loss: 1.2912 - classification_loss: 0.2051 194/500 [==========>...................] - ETA: 1:17 - loss: 1.4960 - regression_loss: 1.2912 - classification_loss: 0.2049 195/500 [==========>...................] - ETA: 1:17 - loss: 1.4919 - regression_loss: 1.2871 - classification_loss: 0.2048 196/500 [==========>...................] - ETA: 1:17 - loss: 1.4925 - regression_loss: 1.2879 - classification_loss: 0.2046 197/500 [==========>...................] - ETA: 1:17 - loss: 1.4974 - regression_loss: 1.2925 - classification_loss: 0.2050 198/500 [==========>...................] - ETA: 1:16 - loss: 1.5003 - regression_loss: 1.2949 - classification_loss: 0.2054 199/500 [==========>...................] - ETA: 1:16 - loss: 1.5001 - regression_loss: 1.2949 - classification_loss: 0.2052 200/500 [===========>..................] - ETA: 1:16 - loss: 1.5000 - regression_loss: 1.2948 - classification_loss: 0.2051 201/500 [===========>..................] 
- ETA: 1:16 - loss: 1.4974 - regression_loss: 1.2927 - classification_loss: 0.2047 202/500 [===========>..................] - ETA: 1:15 - loss: 1.4977 - regression_loss: 1.2931 - classification_loss: 0.2046 203/500 [===========>..................] - ETA: 1:15 - loss: 1.4948 - regression_loss: 1.2909 - classification_loss: 0.2039 204/500 [===========>..................] - ETA: 1:15 - loss: 1.4942 - regression_loss: 1.2909 - classification_loss: 0.2033 205/500 [===========>..................] - ETA: 1:15 - loss: 1.4945 - regression_loss: 1.2910 - classification_loss: 0.2035 206/500 [===========>..................] - ETA: 1:14 - loss: 1.4952 - regression_loss: 1.2917 - classification_loss: 0.2035 207/500 [===========>..................] - ETA: 1:14 - loss: 1.4956 - regression_loss: 1.2920 - classification_loss: 0.2036 208/500 [===========>..................] - ETA: 1:14 - loss: 1.4981 - regression_loss: 1.2941 - classification_loss: 0.2040 209/500 [===========>..................] - ETA: 1:14 - loss: 1.4978 - regression_loss: 1.2941 - classification_loss: 0.2037 210/500 [===========>..................] - ETA: 1:13 - loss: 1.4923 - regression_loss: 1.2891 - classification_loss: 0.2032 211/500 [===========>..................] - ETA: 1:13 - loss: 1.4940 - regression_loss: 1.2906 - classification_loss: 0.2033 212/500 [===========>..................] - ETA: 1:13 - loss: 1.4921 - regression_loss: 1.2888 - classification_loss: 0.2033 213/500 [===========>..................] - ETA: 1:13 - loss: 1.4958 - regression_loss: 1.2922 - classification_loss: 0.2036 214/500 [===========>..................] - ETA: 1:12 - loss: 1.4929 - regression_loss: 1.2895 - classification_loss: 0.2034 215/500 [===========>..................] - ETA: 1:12 - loss: 1.4889 - regression_loss: 1.2861 - classification_loss: 0.2028 216/500 [===========>..................] - ETA: 1:12 - loss: 1.4873 - regression_loss: 1.2852 - classification_loss: 0.2021 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.4859 - regression_loss: 1.2839 - classification_loss: 0.2019 218/500 [============>.................] - ETA: 1:11 - loss: 1.4893 - regression_loss: 1.2871 - classification_loss: 0.2023 219/500 [============>.................] - ETA: 1:11 - loss: 1.4898 - regression_loss: 1.2876 - classification_loss: 0.2022 220/500 [============>.................] - ETA: 1:11 - loss: 1.4911 - regression_loss: 1.2887 - classification_loss: 0.2023 221/500 [============>.................] - ETA: 1:10 - loss: 1.4916 - regression_loss: 1.2889 - classification_loss: 0.2027 222/500 [============>.................] - ETA: 1:10 - loss: 1.4914 - regression_loss: 1.2887 - classification_loss: 0.2028 223/500 [============>.................] - ETA: 1:10 - loss: 1.4926 - regression_loss: 1.2895 - classification_loss: 0.2031 224/500 [============>.................] - ETA: 1:10 - loss: 1.4915 - regression_loss: 1.2885 - classification_loss: 0.2029 225/500 [============>.................] - ETA: 1:09 - loss: 1.4914 - regression_loss: 1.2884 - classification_loss: 0.2030 226/500 [============>.................] - ETA: 1:09 - loss: 1.4901 - regression_loss: 1.2873 - classification_loss: 0.2028 227/500 [============>.................] - ETA: 1:09 - loss: 1.4903 - regression_loss: 1.2874 - classification_loss: 0.2028 228/500 [============>.................] - ETA: 1:08 - loss: 1.4925 - regression_loss: 1.2893 - classification_loss: 0.2031 229/500 [============>.................] - ETA: 1:08 - loss: 1.4888 - regression_loss: 1.2862 - classification_loss: 0.2027 230/500 [============>.................] - ETA: 1:08 - loss: 1.4862 - regression_loss: 1.2841 - classification_loss: 0.2022 231/500 [============>.................] - ETA: 1:08 - loss: 1.4837 - regression_loss: 1.2818 - classification_loss: 0.2018 232/500 [============>.................] - ETA: 1:07 - loss: 1.4831 - regression_loss: 1.2815 - classification_loss: 0.2016 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.4825 - regression_loss: 1.2811 - classification_loss: 0.2015 234/500 [=============>................] - ETA: 1:07 - loss: 1.4787 - regression_loss: 1.2778 - classification_loss: 0.2009 235/500 [=============>................] - ETA: 1:07 - loss: 1.4790 - regression_loss: 1.2782 - classification_loss: 0.2008 236/500 [=============>................] - ETA: 1:06 - loss: 1.4806 - regression_loss: 1.2797 - classification_loss: 0.2010 237/500 [=============>................] - ETA: 1:06 - loss: 1.4814 - regression_loss: 1.2803 - classification_loss: 0.2011 238/500 [=============>................] - ETA: 1:06 - loss: 1.4804 - regression_loss: 1.2796 - classification_loss: 0.2009 239/500 [=============>................] - ETA: 1:06 - loss: 1.4794 - regression_loss: 1.2786 - classification_loss: 0.2008 240/500 [=============>................] - ETA: 1:05 - loss: 1.4798 - regression_loss: 1.2790 - classification_loss: 0.2008 241/500 [=============>................] - ETA: 1:05 - loss: 1.4811 - regression_loss: 1.2801 - classification_loss: 0.2010 242/500 [=============>................] - ETA: 1:05 - loss: 1.4785 - regression_loss: 1.2779 - classification_loss: 0.2006 243/500 [=============>................] - ETA: 1:05 - loss: 1.4787 - regression_loss: 1.2781 - classification_loss: 0.2006 244/500 [=============>................] - ETA: 1:04 - loss: 1.4789 - regression_loss: 1.2785 - classification_loss: 0.2004 245/500 [=============>................] - ETA: 1:04 - loss: 1.4766 - regression_loss: 1.2766 - classification_loss: 0.2000 246/500 [=============>................] - ETA: 1:04 - loss: 1.4790 - regression_loss: 1.2784 - classification_loss: 0.2006 247/500 [=============>................] - ETA: 1:04 - loss: 1.4781 - regression_loss: 1.2777 - classification_loss: 0.2004 248/500 [=============>................] - ETA: 1:03 - loss: 1.4784 - regression_loss: 1.2780 - classification_loss: 0.2004 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.4801 - regression_loss: 1.2796 - classification_loss: 0.2005 250/500 [==============>...............] - ETA: 1:03 - loss: 1.4810 - regression_loss: 1.2804 - classification_loss: 0.2006 251/500 [==============>...............] - ETA: 1:03 - loss: 1.4807 - regression_loss: 1.2803 - classification_loss: 0.2004 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4805 - regression_loss: 1.2801 - classification_loss: 0.2004 253/500 [==============>...............] - ETA: 1:02 - loss: 1.4807 - regression_loss: 1.2804 - classification_loss: 0.2003 254/500 [==============>...............] - ETA: 1:02 - loss: 1.4788 - regression_loss: 1.2787 - classification_loss: 0.2001 255/500 [==============>...............] - ETA: 1:02 - loss: 1.4795 - regression_loss: 1.2794 - classification_loss: 0.2001 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4777 - regression_loss: 1.2779 - classification_loss: 0.1997 257/500 [==============>...............] - ETA: 1:01 - loss: 1.4750 - regression_loss: 1.2757 - classification_loss: 0.1993 258/500 [==============>...............] - ETA: 1:01 - loss: 1.4731 - regression_loss: 1.2742 - classification_loss: 0.1989 259/500 [==============>...............] - ETA: 1:01 - loss: 1.4758 - regression_loss: 1.2763 - classification_loss: 0.1995 260/500 [==============>...............] - ETA: 1:00 - loss: 1.4760 - regression_loss: 1.2767 - classification_loss: 0.1992 261/500 [==============>...............] - ETA: 1:00 - loss: 1.4777 - regression_loss: 1.2783 - classification_loss: 0.1994 262/500 [==============>...............] - ETA: 1:00 - loss: 1.4771 - regression_loss: 1.2780 - classification_loss: 0.1992 263/500 [==============>...............] - ETA: 1:00 - loss: 1.4775 - regression_loss: 1.2784 - classification_loss: 0.1991 264/500 [==============>...............] - ETA: 59s - loss: 1.4766 - regression_loss: 1.2775 - classification_loss: 0.1990  265/500 [==============>...............] 
[per-batch progress for steps 266-499 of epoch 15 trimmed; running loss stayed in the 1.46-1.48 range (regression_loss ~1.27, classification_loss ~0.196)]
500/500 [==============================] - 127s 254ms/step - loss: 1.4672 - regression_loss: 1.2714 - classification_loss: 0.1958
1172 instances of class plum with average precision: 0.7268
mAP: 0.7268
Epoch 00015: saving model to ./training/snapshots/resnet50_pascal_15.h5
Epoch 16/150
[per-batch progress for steps 1-99 of epoch 16 trimmed; loss settled from ~1.8 in the first few steps to ~1.46 by step 99]
- ETA: 1:40 - loss: 1.4570 - regression_loss: 1.2591 - classification_loss: 0.1979 101/500 [=====>........................] - ETA: 1:40 - loss: 1.4600 - regression_loss: 1.2614 - classification_loss: 0.1987 102/500 [=====>........................] - ETA: 1:40 - loss: 1.4629 - regression_loss: 1.2637 - classification_loss: 0.1992 103/500 [=====>........................] - ETA: 1:39 - loss: 1.4546 - regression_loss: 1.2566 - classification_loss: 0.1980 104/500 [=====>........................] - ETA: 1:39 - loss: 1.4554 - regression_loss: 1.2575 - classification_loss: 0.1978 105/500 [=====>........................] - ETA: 1:39 - loss: 1.4564 - regression_loss: 1.2588 - classification_loss: 0.1977 106/500 [=====>........................] - ETA: 1:39 - loss: 1.4572 - regression_loss: 1.2594 - classification_loss: 0.1978 107/500 [=====>........................] - ETA: 1:38 - loss: 1.4604 - regression_loss: 1.2622 - classification_loss: 0.1982 108/500 [=====>........................] - ETA: 1:38 - loss: 1.4583 - regression_loss: 1.2606 - classification_loss: 0.1977 109/500 [=====>........................] - ETA: 1:38 - loss: 1.4612 - regression_loss: 1.2631 - classification_loss: 0.1981 110/500 [=====>........................] - ETA: 1:38 - loss: 1.4560 - regression_loss: 1.2588 - classification_loss: 0.1972 111/500 [=====>........................] - ETA: 1:37 - loss: 1.4591 - regression_loss: 1.2613 - classification_loss: 0.1979 112/500 [=====>........................] - ETA: 1:37 - loss: 1.4620 - regression_loss: 1.2638 - classification_loss: 0.1982 113/500 [=====>........................] - ETA: 1:37 - loss: 1.4618 - regression_loss: 1.2637 - classification_loss: 0.1981 114/500 [=====>........................] - ETA: 1:37 - loss: 1.4577 - regression_loss: 1.2606 - classification_loss: 0.1971 115/500 [=====>........................] - ETA: 1:36 - loss: 1.4568 - regression_loss: 1.2599 - classification_loss: 0.1969 116/500 [=====>........................] 
- ETA: 1:36 - loss: 1.4590 - regression_loss: 1.2618 - classification_loss: 0.1972 117/500 [======>.......................] - ETA: 1:36 - loss: 1.4581 - regression_loss: 1.2613 - classification_loss: 0.1969 118/500 [======>.......................] - ETA: 1:36 - loss: 1.4584 - regression_loss: 1.2615 - classification_loss: 0.1969 119/500 [======>.......................] - ETA: 1:35 - loss: 1.4504 - regression_loss: 1.2542 - classification_loss: 0.1962 120/500 [======>.......................] - ETA: 1:35 - loss: 1.4500 - regression_loss: 1.2539 - classification_loss: 0.1960 121/500 [======>.......................] - ETA: 1:35 - loss: 1.4550 - regression_loss: 1.2582 - classification_loss: 0.1968 122/500 [======>.......................] - ETA: 1:35 - loss: 1.4543 - regression_loss: 1.2577 - classification_loss: 0.1966 123/500 [======>.......................] - ETA: 1:34 - loss: 1.4525 - regression_loss: 1.2561 - classification_loss: 0.1964 124/500 [======>.......................] - ETA: 1:34 - loss: 1.4524 - regression_loss: 1.2557 - classification_loss: 0.1966 125/500 [======>.......................] - ETA: 1:34 - loss: 1.4508 - regression_loss: 1.2545 - classification_loss: 0.1964 126/500 [======>.......................] - ETA: 1:34 - loss: 1.4523 - regression_loss: 1.2558 - classification_loss: 0.1965 127/500 [======>.......................] - ETA: 1:33 - loss: 1.4552 - regression_loss: 1.2583 - classification_loss: 0.1969 128/500 [======>.......................] - ETA: 1:33 - loss: 1.4549 - regression_loss: 1.2586 - classification_loss: 0.1964 129/500 [======>.......................] - ETA: 1:33 - loss: 1.4586 - regression_loss: 1.2617 - classification_loss: 0.1970 130/500 [======>.......................] - ETA: 1:33 - loss: 1.4590 - regression_loss: 1.2617 - classification_loss: 0.1973 131/500 [======>.......................] - ETA: 1:33 - loss: 1.4634 - regression_loss: 1.2656 - classification_loss: 0.1979 132/500 [======>.......................] 
- ETA: 1:32 - loss: 1.4636 - regression_loss: 1.2655 - classification_loss: 0.1981 133/500 [======>.......................] - ETA: 1:32 - loss: 1.4621 - regression_loss: 1.2642 - classification_loss: 0.1980 134/500 [=======>......................] - ETA: 1:32 - loss: 1.4621 - regression_loss: 1.2645 - classification_loss: 0.1977 135/500 [=======>......................] - ETA: 1:32 - loss: 1.4617 - regression_loss: 1.2640 - classification_loss: 0.1977 136/500 [=======>......................] - ETA: 1:31 - loss: 1.4623 - regression_loss: 1.2647 - classification_loss: 0.1976 137/500 [=======>......................] - ETA: 1:31 - loss: 1.4630 - regression_loss: 1.2657 - classification_loss: 0.1973 138/500 [=======>......................] - ETA: 1:31 - loss: 1.4617 - regression_loss: 1.2646 - classification_loss: 0.1971 139/500 [=======>......................] - ETA: 1:31 - loss: 1.4628 - regression_loss: 1.2658 - classification_loss: 0.1970 140/500 [=======>......................] - ETA: 1:30 - loss: 1.4626 - regression_loss: 1.2660 - classification_loss: 0.1966 141/500 [=======>......................] - ETA: 1:30 - loss: 1.4642 - regression_loss: 1.2674 - classification_loss: 0.1968 142/500 [=======>......................] - ETA: 1:30 - loss: 1.4656 - regression_loss: 1.2686 - classification_loss: 0.1969 143/500 [=======>......................] - ETA: 1:30 - loss: 1.4676 - regression_loss: 1.2703 - classification_loss: 0.1973 144/500 [=======>......................] - ETA: 1:29 - loss: 1.4685 - regression_loss: 1.2712 - classification_loss: 0.1972 145/500 [=======>......................] - ETA: 1:29 - loss: 1.4685 - regression_loss: 1.2706 - classification_loss: 0.1980 146/500 [=======>......................] - ETA: 1:29 - loss: 1.4700 - regression_loss: 1.2720 - classification_loss: 0.1980 147/500 [=======>......................] - ETA: 1:29 - loss: 1.4764 - regression_loss: 1.2771 - classification_loss: 0.1993 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.4809 - regression_loss: 1.2811 - classification_loss: 0.1998 149/500 [=======>......................] - ETA: 1:28 - loss: 1.4782 - regression_loss: 1.2791 - classification_loss: 0.1992 150/500 [========>.....................] - ETA: 1:28 - loss: 1.4741 - regression_loss: 1.2757 - classification_loss: 0.1985 151/500 [========>.....................] - ETA: 1:28 - loss: 1.4735 - regression_loss: 1.2754 - classification_loss: 0.1981 152/500 [========>.....................] - ETA: 1:28 - loss: 1.4741 - regression_loss: 1.2759 - classification_loss: 0.1982 153/500 [========>.....................] - ETA: 1:27 - loss: 1.4706 - regression_loss: 1.2732 - classification_loss: 0.1974 154/500 [========>.....................] - ETA: 1:27 - loss: 1.4651 - regression_loss: 1.2686 - classification_loss: 0.1965 155/500 [========>.....................] - ETA: 1:27 - loss: 1.4623 - regression_loss: 1.2661 - classification_loss: 0.1962 156/500 [========>.....................] - ETA: 1:27 - loss: 1.4639 - regression_loss: 1.2673 - classification_loss: 0.1966 157/500 [========>.....................] - ETA: 1:26 - loss: 1.4653 - regression_loss: 1.2688 - classification_loss: 0.1965 158/500 [========>.....................] - ETA: 1:26 - loss: 1.4602 - regression_loss: 1.2645 - classification_loss: 0.1958 159/500 [========>.....................] - ETA: 1:26 - loss: 1.4607 - regression_loss: 1.2651 - classification_loss: 0.1956 160/500 [========>.....................] - ETA: 1:26 - loss: 1.4645 - regression_loss: 1.2680 - classification_loss: 0.1966 161/500 [========>.....................] - ETA: 1:25 - loss: 1.4605 - regression_loss: 1.2644 - classification_loss: 0.1960 162/500 [========>.....................] - ETA: 1:25 - loss: 1.4595 - regression_loss: 1.2636 - classification_loss: 0.1960 163/500 [========>.....................] - ETA: 1:25 - loss: 1.4615 - regression_loss: 1.2656 - classification_loss: 0.1959 164/500 [========>.....................] 
- ETA: 1:25 - loss: 1.4586 - regression_loss: 1.2631 - classification_loss: 0.1954 165/500 [========>.....................] - ETA: 1:24 - loss: 1.4551 - regression_loss: 1.2604 - classification_loss: 0.1948 166/500 [========>.....................] - ETA: 1:24 - loss: 1.4539 - regression_loss: 1.2589 - classification_loss: 0.1950 167/500 [=========>....................] - ETA: 1:24 - loss: 1.4499 - regression_loss: 1.2552 - classification_loss: 0.1947 168/500 [=========>....................] - ETA: 1:24 - loss: 1.4533 - regression_loss: 1.2581 - classification_loss: 0.1952 169/500 [=========>....................] - ETA: 1:23 - loss: 1.4536 - regression_loss: 1.2582 - classification_loss: 0.1954 170/500 [=========>....................] - ETA: 1:23 - loss: 1.4558 - regression_loss: 1.2599 - classification_loss: 0.1958 171/500 [=========>....................] - ETA: 1:23 - loss: 1.4546 - regression_loss: 1.2590 - classification_loss: 0.1956 172/500 [=========>....................] - ETA: 1:23 - loss: 1.4541 - regression_loss: 1.2586 - classification_loss: 0.1955 173/500 [=========>....................] - ETA: 1:22 - loss: 1.4544 - regression_loss: 1.2588 - classification_loss: 0.1955 174/500 [=========>....................] - ETA: 1:22 - loss: 1.4513 - regression_loss: 1.2563 - classification_loss: 0.1949 175/500 [=========>....................] - ETA: 1:22 - loss: 1.4504 - regression_loss: 1.2554 - classification_loss: 0.1950 176/500 [=========>....................] - ETA: 1:22 - loss: 1.4512 - regression_loss: 1.2562 - classification_loss: 0.1950 177/500 [=========>....................] - ETA: 1:21 - loss: 1.4508 - regression_loss: 1.2559 - classification_loss: 0.1949 178/500 [=========>....................] - ETA: 1:21 - loss: 1.4463 - regression_loss: 1.2523 - classification_loss: 0.1940 179/500 [=========>....................] - ETA: 1:21 - loss: 1.4486 - regression_loss: 1.2544 - classification_loss: 0.1942 180/500 [=========>....................] 
- ETA: 1:21 - loss: 1.4470 - regression_loss: 1.2530 - classification_loss: 0.1940 181/500 [=========>....................] - ETA: 1:20 - loss: 1.4473 - regression_loss: 1.2535 - classification_loss: 0.1938 182/500 [=========>....................] - ETA: 1:20 - loss: 1.4442 - regression_loss: 1.2509 - classification_loss: 0.1932 183/500 [=========>....................] - ETA: 1:20 - loss: 1.4440 - regression_loss: 1.2507 - classification_loss: 0.1933 184/500 [==========>...................] - ETA: 1:20 - loss: 1.4483 - regression_loss: 1.2540 - classification_loss: 0.1943 185/500 [==========>...................] - ETA: 1:19 - loss: 1.4509 - regression_loss: 1.2563 - classification_loss: 0.1946 186/500 [==========>...................] - ETA: 1:19 - loss: 1.4511 - regression_loss: 1.2567 - classification_loss: 0.1944 187/500 [==========>...................] - ETA: 1:19 - loss: 1.4546 - regression_loss: 1.2597 - classification_loss: 0.1949 188/500 [==========>...................] - ETA: 1:19 - loss: 1.4573 - regression_loss: 1.2612 - classification_loss: 0.1961 189/500 [==========>...................] - ETA: 1:18 - loss: 1.4540 - regression_loss: 1.2582 - classification_loss: 0.1958 190/500 [==========>...................] - ETA: 1:18 - loss: 1.4557 - regression_loss: 1.2596 - classification_loss: 0.1961 191/500 [==========>...................] - ETA: 1:18 - loss: 1.4527 - regression_loss: 1.2572 - classification_loss: 0.1955 192/500 [==========>...................] - ETA: 1:18 - loss: 1.4549 - regression_loss: 1.2592 - classification_loss: 0.1957 193/500 [==========>...................] - ETA: 1:17 - loss: 1.4542 - regression_loss: 1.2589 - classification_loss: 0.1953 194/500 [==========>...................] - ETA: 1:17 - loss: 1.4552 - regression_loss: 1.2597 - classification_loss: 0.1955 195/500 [==========>...................] - ETA: 1:17 - loss: 1.4537 - regression_loss: 1.2582 - classification_loss: 0.1955 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.4615 - regression_loss: 1.2648 - classification_loss: 0.1967 197/500 [==========>...................] - ETA: 1:16 - loss: 1.4666 - regression_loss: 1.2682 - classification_loss: 0.1985 198/500 [==========>...................] - ETA: 1:16 - loss: 1.4675 - regression_loss: 1.2691 - classification_loss: 0.1983 199/500 [==========>...................] - ETA: 1:16 - loss: 1.4644 - regression_loss: 1.2665 - classification_loss: 0.1979 200/500 [===========>..................] - ETA: 1:15 - loss: 1.4635 - regression_loss: 1.2659 - classification_loss: 0.1976 201/500 [===========>..................] - ETA: 1:15 - loss: 1.4675 - regression_loss: 1.2697 - classification_loss: 0.1978 202/500 [===========>..................] - ETA: 1:15 - loss: 1.4681 - regression_loss: 1.2700 - classification_loss: 0.1981 203/500 [===========>..................] - ETA: 1:15 - loss: 1.4711 - regression_loss: 1.2725 - classification_loss: 0.1986 204/500 [===========>..................] - ETA: 1:14 - loss: 1.4714 - regression_loss: 1.2729 - classification_loss: 0.1985 205/500 [===========>..................] - ETA: 1:14 - loss: 1.4737 - regression_loss: 1.2748 - classification_loss: 0.1988 206/500 [===========>..................] - ETA: 1:14 - loss: 1.4729 - regression_loss: 1.2743 - classification_loss: 0.1985 207/500 [===========>..................] - ETA: 1:14 - loss: 1.4724 - regression_loss: 1.2739 - classification_loss: 0.1985 208/500 [===========>..................] - ETA: 1:13 - loss: 1.4739 - regression_loss: 1.2753 - classification_loss: 0.1986 209/500 [===========>..................] - ETA: 1:13 - loss: 1.4722 - regression_loss: 1.2740 - classification_loss: 0.1983 210/500 [===========>..................] - ETA: 1:13 - loss: 1.4738 - regression_loss: 1.2754 - classification_loss: 0.1984 211/500 [===========>..................] - ETA: 1:13 - loss: 1.4764 - regression_loss: 1.2777 - classification_loss: 0.1987 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.4761 - regression_loss: 1.2774 - classification_loss: 0.1987 213/500 [===========>..................] - ETA: 1:12 - loss: 1.4723 - regression_loss: 1.2740 - classification_loss: 0.1983 214/500 [===========>..................] - ETA: 1:12 - loss: 1.4749 - regression_loss: 1.2766 - classification_loss: 0.1983 215/500 [===========>..................] - ETA: 1:12 - loss: 1.4748 - regression_loss: 1.2763 - classification_loss: 0.1984 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4746 - regression_loss: 1.2761 - classification_loss: 0.1985 217/500 [============>.................] - ETA: 1:11 - loss: 1.4752 - regression_loss: 1.2767 - classification_loss: 0.1985 218/500 [============>.................] - ETA: 1:11 - loss: 1.4786 - regression_loss: 1.2797 - classification_loss: 0.1989 219/500 [============>.................] - ETA: 1:11 - loss: 1.4773 - regression_loss: 1.2787 - classification_loss: 0.1986 220/500 [============>.................] - ETA: 1:10 - loss: 1.4775 - regression_loss: 1.2790 - classification_loss: 0.1985 221/500 [============>.................] - ETA: 1:10 - loss: 1.4784 - regression_loss: 1.2797 - classification_loss: 0.1987 222/500 [============>.................] - ETA: 1:10 - loss: 1.4810 - regression_loss: 1.2822 - classification_loss: 0.1987 223/500 [============>.................] - ETA: 1:10 - loss: 1.4784 - regression_loss: 1.2799 - classification_loss: 0.1985 224/500 [============>.................] - ETA: 1:09 - loss: 1.4796 - regression_loss: 1.2806 - classification_loss: 0.1989 225/500 [============>.................] - ETA: 1:09 - loss: 1.4790 - regression_loss: 1.2802 - classification_loss: 0.1988 226/500 [============>.................] - ETA: 1:09 - loss: 1.4785 - regression_loss: 1.2800 - classification_loss: 0.1986 227/500 [============>.................] - ETA: 1:09 - loss: 1.4755 - regression_loss: 1.2776 - classification_loss: 0.1979 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.4738 - regression_loss: 1.2762 - classification_loss: 0.1976 229/500 [============>.................] - ETA: 1:08 - loss: 1.4736 - regression_loss: 1.2758 - classification_loss: 0.1978 230/500 [============>.................] - ETA: 1:08 - loss: 1.4754 - regression_loss: 1.2772 - classification_loss: 0.1983 231/500 [============>.................] - ETA: 1:08 - loss: 1.4724 - regression_loss: 1.2744 - classification_loss: 0.1980 232/500 [============>.................] - ETA: 1:07 - loss: 1.4730 - regression_loss: 1.2748 - classification_loss: 0.1982 233/500 [============>.................] - ETA: 1:07 - loss: 1.4755 - regression_loss: 1.2768 - classification_loss: 0.1987 234/500 [=============>................] - ETA: 1:07 - loss: 1.4721 - regression_loss: 1.2739 - classification_loss: 0.1982 235/500 [=============>................] - ETA: 1:07 - loss: 1.4698 - regression_loss: 1.2718 - classification_loss: 0.1980 236/500 [=============>................] - ETA: 1:06 - loss: 1.4687 - regression_loss: 1.2707 - classification_loss: 0.1980 237/500 [=============>................] - ETA: 1:06 - loss: 1.4676 - regression_loss: 1.2697 - classification_loss: 0.1979 238/500 [=============>................] - ETA: 1:06 - loss: 1.4661 - regression_loss: 1.2685 - classification_loss: 0.1977 239/500 [=============>................] - ETA: 1:06 - loss: 1.4657 - regression_loss: 1.2680 - classification_loss: 0.1977 240/500 [=============>................] - ETA: 1:05 - loss: 1.4657 - regression_loss: 1.2681 - classification_loss: 0.1976 241/500 [=============>................] - ETA: 1:05 - loss: 1.4647 - regression_loss: 1.2672 - classification_loss: 0.1975 242/500 [=============>................] - ETA: 1:05 - loss: 1.4656 - regression_loss: 1.2682 - classification_loss: 0.1974 243/500 [=============>................] - ETA: 1:05 - loss: 1.4649 - regression_loss: 1.2675 - classification_loss: 0.1974 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.4622 - regression_loss: 1.2652 - classification_loss: 0.1970 245/500 [=============>................] - ETA: 1:04 - loss: 1.4639 - regression_loss: 1.2666 - classification_loss: 0.1972 246/500 [=============>................] - ETA: 1:04 - loss: 1.4615 - regression_loss: 1.2645 - classification_loss: 0.1970 247/500 [=============>................] - ETA: 1:04 - loss: 1.4581 - regression_loss: 1.2617 - classification_loss: 0.1964 248/500 [=============>................] - ETA: 1:03 - loss: 1.4570 - regression_loss: 1.2608 - classification_loss: 0.1962 249/500 [=============>................] - ETA: 1:03 - loss: 1.4582 - regression_loss: 1.2617 - classification_loss: 0.1964 250/500 [==============>...............] - ETA: 1:03 - loss: 1.4584 - regression_loss: 1.2619 - classification_loss: 0.1965 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4559 - regression_loss: 1.2599 - classification_loss: 0.1959 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4563 - regression_loss: 1.2604 - classification_loss: 0.1959 253/500 [==============>...............] - ETA: 1:02 - loss: 1.4542 - regression_loss: 1.2587 - classification_loss: 0.1955 254/500 [==============>...............] - ETA: 1:02 - loss: 1.4539 - regression_loss: 1.2584 - classification_loss: 0.1954 255/500 [==============>...............] - ETA: 1:02 - loss: 1.4555 - regression_loss: 1.2598 - classification_loss: 0.1957 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4522 - regression_loss: 1.2570 - classification_loss: 0.1952 257/500 [==============>...............] - ETA: 1:01 - loss: 1.4524 - regression_loss: 1.2571 - classification_loss: 0.1953 258/500 [==============>...............] - ETA: 1:01 - loss: 1.4534 - regression_loss: 1.2579 - classification_loss: 0.1955 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4499 - regression_loss: 1.2550 - classification_loss: 0.1949 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.4523 - regression_loss: 1.2569 - classification_loss: 0.1954 261/500 [==============>...............] - ETA: 1:00 - loss: 1.4509 - regression_loss: 1.2558 - classification_loss: 0.1951 262/500 [==============>...............] - ETA: 1:00 - loss: 1.4512 - regression_loss: 1.2562 - classification_loss: 0.1950 263/500 [==============>...............] - ETA: 59s - loss: 1.4500 - regression_loss: 1.2552 - classification_loss: 0.1948  264/500 [==============>...............] - ETA: 59s - loss: 1.4488 - regression_loss: 1.2543 - classification_loss: 0.1945 265/500 [==============>...............] - ETA: 59s - loss: 1.4481 - regression_loss: 1.2536 - classification_loss: 0.1945 266/500 [==============>...............] - ETA: 59s - loss: 1.4492 - regression_loss: 1.2545 - classification_loss: 0.1947 267/500 [===============>..............] - ETA: 58s - loss: 1.4497 - regression_loss: 1.2550 - classification_loss: 0.1947 268/500 [===============>..............] - ETA: 58s - loss: 1.4493 - regression_loss: 1.2547 - classification_loss: 0.1947 269/500 [===============>..............] - ETA: 58s - loss: 1.4492 - regression_loss: 1.2546 - classification_loss: 0.1946 270/500 [===============>..............] - ETA: 58s - loss: 1.4490 - regression_loss: 1.2545 - classification_loss: 0.1945 271/500 [===============>..............] - ETA: 57s - loss: 1.4499 - regression_loss: 1.2552 - classification_loss: 0.1947 272/500 [===============>..............] - ETA: 57s - loss: 1.4512 - regression_loss: 1.2563 - classification_loss: 0.1950 273/500 [===============>..............] - ETA: 57s - loss: 1.4512 - regression_loss: 1.2562 - classification_loss: 0.1950 274/500 [===============>..............] - ETA: 57s - loss: 1.4524 - regression_loss: 1.2574 - classification_loss: 0.1951 275/500 [===============>..............] - ETA: 56s - loss: 1.4504 - regression_loss: 1.2557 - classification_loss: 0.1947 276/500 [===============>..............] 
- ETA: 56s - loss: 1.4533 - regression_loss: 1.2584 - classification_loss: 0.1949 277/500 [===============>..............] - ETA: 56s - loss: 1.4539 - regression_loss: 1.2589 - classification_loss: 0.1950 278/500 [===============>..............] - ETA: 56s - loss: 1.4541 - regression_loss: 1.2591 - classification_loss: 0.1950 279/500 [===============>..............] - ETA: 55s - loss: 1.4539 - regression_loss: 1.2591 - classification_loss: 0.1948 280/500 [===============>..............] - ETA: 55s - loss: 1.4545 - regression_loss: 1.2601 - classification_loss: 0.1944 281/500 [===============>..............] - ETA: 55s - loss: 1.4544 - regression_loss: 1.2601 - classification_loss: 0.1944 282/500 [===============>..............] - ETA: 55s - loss: 1.4543 - regression_loss: 1.2600 - classification_loss: 0.1943 283/500 [===============>..............] - ETA: 54s - loss: 1.4545 - regression_loss: 1.2602 - classification_loss: 0.1943 284/500 [================>.............] - ETA: 54s - loss: 1.4554 - regression_loss: 1.2606 - classification_loss: 0.1947 285/500 [================>.............] - ETA: 54s - loss: 1.4540 - regression_loss: 1.2596 - classification_loss: 0.1943 286/500 [================>.............] - ETA: 54s - loss: 1.4541 - regression_loss: 1.2598 - classification_loss: 0.1943 287/500 [================>.............] - ETA: 53s - loss: 1.4540 - regression_loss: 1.2597 - classification_loss: 0.1942 288/500 [================>.............] - ETA: 53s - loss: 1.4537 - regression_loss: 1.2596 - classification_loss: 0.1941 289/500 [================>.............] - ETA: 53s - loss: 1.4536 - regression_loss: 1.2595 - classification_loss: 0.1941 290/500 [================>.............] - ETA: 53s - loss: 1.4530 - regression_loss: 1.2592 - classification_loss: 0.1938 291/500 [================>.............] - ETA: 52s - loss: 1.4510 - regression_loss: 1.2574 - classification_loss: 0.1936 292/500 [================>.............] 
- ETA: 52s - loss: 1.4526 - regression_loss: 1.2588 - classification_loss: 0.1938 293/500 [================>.............] - ETA: 52s - loss: 1.4500 - regression_loss: 1.2565 - classification_loss: 0.1935 294/500 [================>.............] - ETA: 52s - loss: 1.4476 - regression_loss: 1.2546 - classification_loss: 0.1931 295/500 [================>.............] - ETA: 51s - loss: 1.4444 - regression_loss: 1.2518 - classification_loss: 0.1926 296/500 [================>.............] - ETA: 51s - loss: 1.4428 - regression_loss: 1.2505 - classification_loss: 0.1923 297/500 [================>.............] - ETA: 51s - loss: 1.4438 - regression_loss: 1.2513 - classification_loss: 0.1925 298/500 [================>.............] - ETA: 51s - loss: 1.4443 - regression_loss: 1.2518 - classification_loss: 0.1925 299/500 [================>.............] - ETA: 50s - loss: 1.4449 - regression_loss: 1.2521 - classification_loss: 0.1928 300/500 [=================>............] - ETA: 50s - loss: 1.4467 - regression_loss: 1.2538 - classification_loss: 0.1929 301/500 [=================>............] - ETA: 50s - loss: 1.4456 - regression_loss: 1.2526 - classification_loss: 0.1930 302/500 [=================>............] - ETA: 50s - loss: 1.4464 - regression_loss: 1.2532 - classification_loss: 0.1931 303/500 [=================>............] - ETA: 49s - loss: 1.4480 - regression_loss: 1.2546 - classification_loss: 0.1934 304/500 [=================>............] - ETA: 49s - loss: 1.4493 - regression_loss: 1.2558 - classification_loss: 0.1936 305/500 [=================>............] - ETA: 49s - loss: 1.4484 - regression_loss: 1.2548 - classification_loss: 0.1935 306/500 [=================>............] - ETA: 49s - loss: 1.4500 - regression_loss: 1.2563 - classification_loss: 0.1937 307/500 [=================>............] - ETA: 48s - loss: 1.4515 - regression_loss: 1.2573 - classification_loss: 0.1942 308/500 [=================>............] 
[... per-batch progress lines (loss hovering around 1.45) elided ...]
500/500 [==============================] - 127s 253ms/step - loss: 1.4521 - regression_loss: 1.2594 - classification_loss: 0.1926
1172 instances of class plum with average precision: 0.7294
mAP: 0.7294
Epoch 00016: saving model to ./training/snapshots/resnet50_pascal_16.h5
Epoch 17/150
[... per-batch progress lines elided ...]
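Not part of the log itself: a minimal Python sketch for pulling the final metrics out of an end-of-epoch summary line like the one above. The line format is assumed from this log; other Keras setups may emit different metric names.

```python
import re

# End-of-epoch summary line, copied from the training log above.
line = ("500/500 [==============================] - 127s 253ms/step - "
        "loss: 1.4521 - regression_loss: 1.2594 - classification_loss: 0.1926")

# Each metric appears as "name: value"; "127s" and "253ms/step" have no
# colon-separated numeric value in this form, so they are skipped.
metrics = {m.group(1): float(m.group(2))
           for m in re.finditer(r"(\w+): ([0-9.]+)", line)}

print(metrics["loss"])                 # 1.4521
print(metrics["regression_loss"])      # 1.2594
print(metrics["classification_loss"])  # 0.1926
```

As an aside, the per-batch bars captured here come from Keras's default `verbose=1` progress logging; passing `verbose=2` to `fit`/`fit_generator` produces one line per epoch instead, which keeps redirected logs like this one readable.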
- ETA: 1:30 - loss: 1.4298 - regression_loss: 1.2433 - classification_loss: 0.1865 143/500 [=======>......................] - ETA: 1:30 - loss: 1.4297 - regression_loss: 1.2433 - classification_loss: 0.1864 144/500 [=======>......................] - ETA: 1:30 - loss: 1.4310 - regression_loss: 1.2445 - classification_loss: 0.1864 145/500 [=======>......................] - ETA: 1:30 - loss: 1.4315 - regression_loss: 1.2445 - classification_loss: 0.1870 146/500 [=======>......................] - ETA: 1:29 - loss: 1.4332 - regression_loss: 1.2460 - classification_loss: 0.1873 147/500 [=======>......................] - ETA: 1:29 - loss: 1.4337 - regression_loss: 1.2465 - classification_loss: 0.1872 148/500 [=======>......................] - ETA: 1:29 - loss: 1.4354 - regression_loss: 1.2480 - classification_loss: 0.1874 149/500 [=======>......................] - ETA: 1:29 - loss: 1.4364 - regression_loss: 1.2488 - classification_loss: 0.1876 150/500 [========>.....................] - ETA: 1:28 - loss: 1.4292 - regression_loss: 1.2425 - classification_loss: 0.1867 151/500 [========>.....................] - ETA: 1:28 - loss: 1.4301 - regression_loss: 1.2432 - classification_loss: 0.1869 152/500 [========>.....................] - ETA: 1:28 - loss: 1.4315 - regression_loss: 1.2450 - classification_loss: 0.1866 153/500 [========>.....................] - ETA: 1:28 - loss: 1.4258 - regression_loss: 1.2399 - classification_loss: 0.1859 154/500 [========>.....................] - ETA: 1:27 - loss: 1.4279 - regression_loss: 1.2416 - classification_loss: 0.1863 155/500 [========>.....................] - ETA: 1:27 - loss: 1.4228 - regression_loss: 1.2369 - classification_loss: 0.1858 156/500 [========>.....................] - ETA: 1:27 - loss: 1.4261 - regression_loss: 1.2395 - classification_loss: 0.1867 157/500 [========>.....................] - ETA: 1:27 - loss: 1.4254 - regression_loss: 1.2389 - classification_loss: 0.1865 158/500 [========>.....................] 
- ETA: 1:26 - loss: 1.4215 - regression_loss: 1.2354 - classification_loss: 0.1860 159/500 [========>.....................] - ETA: 1:26 - loss: 1.4231 - regression_loss: 1.2369 - classification_loss: 0.1862 160/500 [========>.....................] - ETA: 1:26 - loss: 1.4224 - regression_loss: 1.2362 - classification_loss: 0.1862 161/500 [========>.....................] - ETA: 1:25 - loss: 1.4231 - regression_loss: 1.2368 - classification_loss: 0.1863 162/500 [========>.....................] - ETA: 1:25 - loss: 1.4237 - regression_loss: 1.2374 - classification_loss: 0.1863 163/500 [========>.....................] - ETA: 1:25 - loss: 1.4233 - regression_loss: 1.2367 - classification_loss: 0.1866 164/500 [========>.....................] - ETA: 1:25 - loss: 1.4229 - regression_loss: 1.2365 - classification_loss: 0.1864 165/500 [========>.....................] - ETA: 1:25 - loss: 1.4252 - regression_loss: 1.2385 - classification_loss: 0.1867 166/500 [========>.....................] - ETA: 1:24 - loss: 1.4270 - regression_loss: 1.2394 - classification_loss: 0.1876 167/500 [=========>....................] - ETA: 1:24 - loss: 1.4244 - regression_loss: 1.2373 - classification_loss: 0.1870 168/500 [=========>....................] - ETA: 1:24 - loss: 1.4318 - regression_loss: 1.2433 - classification_loss: 0.1885 169/500 [=========>....................] - ETA: 1:23 - loss: 1.4336 - regression_loss: 1.2452 - classification_loss: 0.1884 170/500 [=========>....................] - ETA: 1:23 - loss: 1.4342 - regression_loss: 1.2457 - classification_loss: 0.1886 171/500 [=========>....................] - ETA: 1:23 - loss: 1.4323 - regression_loss: 1.2439 - classification_loss: 0.1884 172/500 [=========>....................] - ETA: 1:23 - loss: 1.4294 - regression_loss: 1.2417 - classification_loss: 0.1878 173/500 [=========>....................] - ETA: 1:22 - loss: 1.4294 - regression_loss: 1.2416 - classification_loss: 0.1878 174/500 [=========>....................] 
- ETA: 1:22 - loss: 1.4264 - regression_loss: 1.2389 - classification_loss: 0.1875 175/500 [=========>....................] - ETA: 1:22 - loss: 1.4261 - regression_loss: 1.2387 - classification_loss: 0.1875 176/500 [=========>....................] - ETA: 1:22 - loss: 1.4222 - regression_loss: 1.2354 - classification_loss: 0.1869 177/500 [=========>....................] - ETA: 1:21 - loss: 1.4231 - regression_loss: 1.2363 - classification_loss: 0.1868 178/500 [=========>....................] - ETA: 1:21 - loss: 1.4238 - regression_loss: 1.2370 - classification_loss: 0.1868 179/500 [=========>....................] - ETA: 1:21 - loss: 1.4251 - regression_loss: 1.2379 - classification_loss: 0.1872 180/500 [=========>....................] - ETA: 1:21 - loss: 1.4245 - regression_loss: 1.2374 - classification_loss: 0.1871 181/500 [=========>....................] - ETA: 1:20 - loss: 1.4251 - regression_loss: 1.2379 - classification_loss: 0.1872 182/500 [=========>....................] - ETA: 1:20 - loss: 1.4231 - regression_loss: 1.2364 - classification_loss: 0.1867 183/500 [=========>....................] - ETA: 1:20 - loss: 1.4236 - regression_loss: 1.2368 - classification_loss: 0.1868 184/500 [==========>...................] - ETA: 1:20 - loss: 1.4332 - regression_loss: 1.2429 - classification_loss: 0.1903 185/500 [==========>...................] - ETA: 1:19 - loss: 1.4340 - regression_loss: 1.2438 - classification_loss: 0.1903 186/500 [==========>...................] - ETA: 1:19 - loss: 1.4330 - regression_loss: 1.2429 - classification_loss: 0.1901 187/500 [==========>...................] - ETA: 1:19 - loss: 1.4351 - regression_loss: 1.2443 - classification_loss: 0.1907 188/500 [==========>...................] - ETA: 1:19 - loss: 1.4363 - regression_loss: 1.2459 - classification_loss: 0.1904 189/500 [==========>...................] - ETA: 1:18 - loss: 1.4378 - regression_loss: 1.2473 - classification_loss: 0.1905 190/500 [==========>...................] 
- ETA: 1:18 - loss: 1.4378 - regression_loss: 1.2473 - classification_loss: 0.1906 191/500 [==========>...................] - ETA: 1:18 - loss: 1.4384 - regression_loss: 1.2473 - classification_loss: 0.1911 192/500 [==========>...................] - ETA: 1:18 - loss: 1.4386 - regression_loss: 1.2475 - classification_loss: 0.1911 193/500 [==========>...................] - ETA: 1:17 - loss: 1.4368 - regression_loss: 1.2460 - classification_loss: 0.1908 194/500 [==========>...................] - ETA: 1:17 - loss: 1.4394 - regression_loss: 1.2481 - classification_loss: 0.1913 195/500 [==========>...................] - ETA: 1:17 - loss: 1.4405 - regression_loss: 1.2490 - classification_loss: 0.1915 196/500 [==========>...................] - ETA: 1:17 - loss: 1.4379 - regression_loss: 1.2470 - classification_loss: 0.1910 197/500 [==========>...................] - ETA: 1:16 - loss: 1.4366 - regression_loss: 1.2459 - classification_loss: 0.1906 198/500 [==========>...................] - ETA: 1:16 - loss: 1.4370 - regression_loss: 1.2465 - classification_loss: 0.1905 199/500 [==========>...................] - ETA: 1:16 - loss: 1.4370 - regression_loss: 1.2464 - classification_loss: 0.1906 200/500 [===========>..................] - ETA: 1:16 - loss: 1.4376 - regression_loss: 1.2472 - classification_loss: 0.1904 201/500 [===========>..................] - ETA: 1:15 - loss: 1.4330 - regression_loss: 1.2432 - classification_loss: 0.1898 202/500 [===========>..................] - ETA: 1:15 - loss: 1.4348 - regression_loss: 1.2448 - classification_loss: 0.1900 203/500 [===========>..................] - ETA: 1:15 - loss: 1.4310 - regression_loss: 1.2416 - classification_loss: 0.1893 204/500 [===========>..................] - ETA: 1:15 - loss: 1.4327 - regression_loss: 1.2431 - classification_loss: 0.1896 205/500 [===========>..................] - ETA: 1:14 - loss: 1.4310 - regression_loss: 1.2420 - classification_loss: 0.1890 206/500 [===========>..................] 
- ETA: 1:14 - loss: 1.4317 - regression_loss: 1.2429 - classification_loss: 0.1888 207/500 [===========>..................] - ETA: 1:14 - loss: 1.4325 - regression_loss: 1.2435 - classification_loss: 0.1889 208/500 [===========>..................] - ETA: 1:14 - loss: 1.4307 - regression_loss: 1.2421 - classification_loss: 0.1887 209/500 [===========>..................] - ETA: 1:13 - loss: 1.4307 - regression_loss: 1.2421 - classification_loss: 0.1886 210/500 [===========>..................] - ETA: 1:13 - loss: 1.4312 - regression_loss: 1.2424 - classification_loss: 0.1889 211/500 [===========>..................] - ETA: 1:13 - loss: 1.4299 - regression_loss: 1.2412 - classification_loss: 0.1887 212/500 [===========>..................] - ETA: 1:13 - loss: 1.4312 - regression_loss: 1.2424 - classification_loss: 0.1888 213/500 [===========>..................] - ETA: 1:12 - loss: 1.4305 - regression_loss: 1.2418 - classification_loss: 0.1886 214/500 [===========>..................] - ETA: 1:12 - loss: 1.4321 - regression_loss: 1.2432 - classification_loss: 0.1888 215/500 [===========>..................] - ETA: 1:12 - loss: 1.4295 - regression_loss: 1.2411 - classification_loss: 0.1883 216/500 [===========>..................] - ETA: 1:12 - loss: 1.4303 - regression_loss: 1.2419 - classification_loss: 0.1884 217/500 [============>.................] - ETA: 1:11 - loss: 1.4306 - regression_loss: 1.2420 - classification_loss: 0.1885 218/500 [============>.................] - ETA: 1:11 - loss: 1.4322 - regression_loss: 1.2434 - classification_loss: 0.1888 219/500 [============>.................] - ETA: 1:11 - loss: 1.4341 - regression_loss: 1.2451 - classification_loss: 0.1890 220/500 [============>.................] - ETA: 1:11 - loss: 1.4351 - regression_loss: 1.2460 - classification_loss: 0.1891 221/500 [============>.................] - ETA: 1:10 - loss: 1.4366 - regression_loss: 1.2472 - classification_loss: 0.1894 222/500 [============>.................] 
- ETA: 1:10 - loss: 1.4368 - regression_loss: 1.2476 - classification_loss: 0.1893 223/500 [============>.................] - ETA: 1:10 - loss: 1.4376 - regression_loss: 1.2482 - classification_loss: 0.1894 224/500 [============>.................] - ETA: 1:10 - loss: 1.4373 - regression_loss: 1.2481 - classification_loss: 0.1892 225/500 [============>.................] - ETA: 1:09 - loss: 1.4373 - regression_loss: 1.2479 - classification_loss: 0.1894 226/500 [============>.................] - ETA: 1:09 - loss: 1.4376 - regression_loss: 1.2482 - classification_loss: 0.1894 227/500 [============>.................] - ETA: 1:09 - loss: 1.4372 - regression_loss: 1.2478 - classification_loss: 0.1894 228/500 [============>.................] - ETA: 1:09 - loss: 1.4365 - regression_loss: 1.2472 - classification_loss: 0.1893 229/500 [============>.................] - ETA: 1:08 - loss: 1.4374 - regression_loss: 1.2482 - classification_loss: 0.1893 230/500 [============>.................] - ETA: 1:08 - loss: 1.4368 - regression_loss: 1.2478 - classification_loss: 0.1890 231/500 [============>.................] - ETA: 1:08 - loss: 1.4387 - regression_loss: 1.2494 - classification_loss: 0.1894 232/500 [============>.................] - ETA: 1:08 - loss: 1.4398 - regression_loss: 1.2504 - classification_loss: 0.1894 233/500 [============>.................] - ETA: 1:07 - loss: 1.4387 - regression_loss: 1.2492 - classification_loss: 0.1895 234/500 [=============>................] - ETA: 1:07 - loss: 1.4349 - regression_loss: 1.2458 - classification_loss: 0.1891 235/500 [=============>................] - ETA: 1:07 - loss: 1.4335 - regression_loss: 1.2447 - classification_loss: 0.1888 236/500 [=============>................] - ETA: 1:07 - loss: 1.4317 - regression_loss: 1.2433 - classification_loss: 0.1884 237/500 [=============>................] - ETA: 1:06 - loss: 1.4304 - regression_loss: 1.2421 - classification_loss: 0.1883 238/500 [=============>................] 
- ETA: 1:06 - loss: 1.4311 - regression_loss: 1.2428 - classification_loss: 0.1884 239/500 [=============>................] - ETA: 1:06 - loss: 1.4291 - regression_loss: 1.2412 - classification_loss: 0.1879 240/500 [=============>................] - ETA: 1:06 - loss: 1.4297 - regression_loss: 1.2418 - classification_loss: 0.1879 241/500 [=============>................] - ETA: 1:05 - loss: 1.4276 - regression_loss: 1.2400 - classification_loss: 0.1875 242/500 [=============>................] - ETA: 1:05 - loss: 1.4286 - regression_loss: 1.2411 - classification_loss: 0.1876 243/500 [=============>................] - ETA: 1:05 - loss: 1.4260 - regression_loss: 1.2388 - classification_loss: 0.1872 244/500 [=============>................] - ETA: 1:05 - loss: 1.4219 - regression_loss: 1.2352 - classification_loss: 0.1867 245/500 [=============>................] - ETA: 1:04 - loss: 1.4231 - regression_loss: 1.2362 - classification_loss: 0.1869 246/500 [=============>................] - ETA: 1:04 - loss: 1.4224 - regression_loss: 1.2356 - classification_loss: 0.1867 247/500 [=============>................] - ETA: 1:04 - loss: 1.4218 - regression_loss: 1.2352 - classification_loss: 0.1865 248/500 [=============>................] - ETA: 1:04 - loss: 1.4232 - regression_loss: 1.2365 - classification_loss: 0.1867 249/500 [=============>................] - ETA: 1:03 - loss: 1.4252 - regression_loss: 1.2380 - classification_loss: 0.1873 250/500 [==============>...............] - ETA: 1:03 - loss: 1.4259 - regression_loss: 1.2386 - classification_loss: 0.1873 251/500 [==============>...............] - ETA: 1:03 - loss: 1.4281 - regression_loss: 1.2403 - classification_loss: 0.1879 252/500 [==============>...............] - ETA: 1:03 - loss: 1.4300 - regression_loss: 1.2421 - classification_loss: 0.1879 253/500 [==============>...............] - ETA: 1:02 - loss: 1.4315 - regression_loss: 1.2436 - classification_loss: 0.1879 254/500 [==============>...............] 
- ETA: 1:02 - loss: 1.4275 - regression_loss: 1.2402 - classification_loss: 0.1874 255/500 [==============>...............] - ETA: 1:02 - loss: 1.4274 - regression_loss: 1.2401 - classification_loss: 0.1873 256/500 [==============>...............] - ETA: 1:02 - loss: 1.4273 - regression_loss: 1.2401 - classification_loss: 0.1872 257/500 [==============>...............] - ETA: 1:01 - loss: 1.4289 - regression_loss: 1.2413 - classification_loss: 0.1876 258/500 [==============>...............] - ETA: 1:01 - loss: 1.4289 - regression_loss: 1.2412 - classification_loss: 0.1876 259/500 [==============>...............] - ETA: 1:01 - loss: 1.4288 - regression_loss: 1.2412 - classification_loss: 0.1876 260/500 [==============>...............] - ETA: 1:00 - loss: 1.4289 - regression_loss: 1.2411 - classification_loss: 0.1877 261/500 [==============>...............] - ETA: 1:00 - loss: 1.4281 - regression_loss: 1.2405 - classification_loss: 0.1876 262/500 [==============>...............] - ETA: 1:00 - loss: 1.4319 - regression_loss: 1.2440 - classification_loss: 0.1879 263/500 [==============>...............] - ETA: 1:00 - loss: 1.4315 - regression_loss: 1.2437 - classification_loss: 0.1879 264/500 [==============>...............] - ETA: 59s - loss: 1.4307 - regression_loss: 1.2420 - classification_loss: 0.1887  265/500 [==============>...............] - ETA: 59s - loss: 1.4302 - regression_loss: 1.2416 - classification_loss: 0.1887 266/500 [==============>...............] - ETA: 59s - loss: 1.4298 - regression_loss: 1.2413 - classification_loss: 0.1886 267/500 [===============>..............] - ETA: 59s - loss: 1.4329 - regression_loss: 1.2440 - classification_loss: 0.1889 268/500 [===============>..............] - ETA: 58s - loss: 1.4309 - regression_loss: 1.2422 - classification_loss: 0.1888 269/500 [===============>..............] - ETA: 58s - loss: 1.4288 - regression_loss: 1.2404 - classification_loss: 0.1884 270/500 [===============>..............] 
- ETA: 58s - loss: 1.4288 - regression_loss: 1.2402 - classification_loss: 0.1886 271/500 [===============>..............] - ETA: 58s - loss: 1.4288 - regression_loss: 1.2402 - classification_loss: 0.1886 272/500 [===============>..............] - ETA: 57s - loss: 1.4304 - regression_loss: 1.2417 - classification_loss: 0.1887 273/500 [===============>..............] - ETA: 57s - loss: 1.4304 - regression_loss: 1.2417 - classification_loss: 0.1887 274/500 [===============>..............] - ETA: 57s - loss: 1.4306 - regression_loss: 1.2418 - classification_loss: 0.1888 275/500 [===============>..............] - ETA: 57s - loss: 1.4302 - regression_loss: 1.2415 - classification_loss: 0.1887 276/500 [===============>..............] - ETA: 56s - loss: 1.4276 - regression_loss: 1.2393 - classification_loss: 0.1883 277/500 [===============>..............] - ETA: 56s - loss: 1.4276 - regression_loss: 1.2394 - classification_loss: 0.1882 278/500 [===============>..............] - ETA: 56s - loss: 1.4257 - regression_loss: 1.2377 - classification_loss: 0.1880 279/500 [===============>..............] - ETA: 56s - loss: 1.4232 - regression_loss: 1.2355 - classification_loss: 0.1877 280/500 [===============>..............] - ETA: 55s - loss: 1.4217 - regression_loss: 1.2343 - classification_loss: 0.1874 281/500 [===============>..............] - ETA: 55s - loss: 1.4193 - regression_loss: 1.2322 - classification_loss: 0.1870 282/500 [===============>..............] - ETA: 55s - loss: 1.4171 - regression_loss: 1.2300 - classification_loss: 0.1871 283/500 [===============>..............] - ETA: 55s - loss: 1.4172 - regression_loss: 1.2300 - classification_loss: 0.1872 284/500 [================>.............] - ETA: 54s - loss: 1.4179 - regression_loss: 1.2306 - classification_loss: 0.1873 285/500 [================>.............] - ETA: 54s - loss: 1.4194 - regression_loss: 1.2319 - classification_loss: 0.1875 286/500 [================>.............] 
- ETA: 54s - loss: 1.4198 - regression_loss: 1.2323 - classification_loss: 0.1875 287/500 [================>.............] - ETA: 54s - loss: 1.4207 - regression_loss: 1.2332 - classification_loss: 0.1875 288/500 [================>.............] - ETA: 53s - loss: 1.4174 - regression_loss: 1.2304 - classification_loss: 0.1870 289/500 [================>.............] - ETA: 53s - loss: 1.4182 - regression_loss: 1.2311 - classification_loss: 0.1871 290/500 [================>.............] - ETA: 53s - loss: 1.4158 - regression_loss: 1.2291 - classification_loss: 0.1867 291/500 [================>.............] - ETA: 53s - loss: 1.4149 - regression_loss: 1.2286 - classification_loss: 0.1863 292/500 [================>.............] - ETA: 52s - loss: 1.4159 - regression_loss: 1.2295 - classification_loss: 0.1863 293/500 [================>.............] - ETA: 52s - loss: 1.4169 - regression_loss: 1.2305 - classification_loss: 0.1864 294/500 [================>.............] - ETA: 52s - loss: 1.4174 - regression_loss: 1.2312 - classification_loss: 0.1862 295/500 [================>.............] - ETA: 52s - loss: 1.4197 - regression_loss: 1.2332 - classification_loss: 0.1866 296/500 [================>.............] - ETA: 51s - loss: 1.4192 - regression_loss: 1.2328 - classification_loss: 0.1864 297/500 [================>.............] - ETA: 51s - loss: 1.4205 - regression_loss: 1.2339 - classification_loss: 0.1866 298/500 [================>.............] - ETA: 51s - loss: 1.4202 - regression_loss: 1.2337 - classification_loss: 0.1865 299/500 [================>.............] - ETA: 51s - loss: 1.4205 - regression_loss: 1.2339 - classification_loss: 0.1866 300/500 [=================>............] - ETA: 50s - loss: 1.4206 - regression_loss: 1.2340 - classification_loss: 0.1866 301/500 [=================>............] - ETA: 50s - loss: 1.4222 - regression_loss: 1.2352 - classification_loss: 0.1870 302/500 [=================>............] 
- ETA: 50s - loss: 1.4239 - regression_loss: 1.2366 - classification_loss: 0.1874 303/500 [=================>............] - ETA: 49s - loss: 1.4250 - regression_loss: 1.2372 - classification_loss: 0.1878 304/500 [=================>............] - ETA: 49s - loss: 1.4221 - regression_loss: 1.2347 - classification_loss: 0.1873 305/500 [=================>............] - ETA: 49s - loss: 1.4223 - regression_loss: 1.2351 - classification_loss: 0.1872 306/500 [=================>............] - ETA: 49s - loss: 1.4210 - regression_loss: 1.2341 - classification_loss: 0.1869 307/500 [=================>............] - ETA: 48s - loss: 1.4202 - regression_loss: 1.2335 - classification_loss: 0.1867 308/500 [=================>............] - ETA: 48s - loss: 1.4212 - regression_loss: 1.2343 - classification_loss: 0.1869 309/500 [=================>............] - ETA: 48s - loss: 1.4191 - regression_loss: 1.2324 - classification_loss: 0.1867 310/500 [=================>............] - ETA: 48s - loss: 1.4194 - regression_loss: 1.2327 - classification_loss: 0.1868 311/500 [=================>............] - ETA: 47s - loss: 1.4190 - regression_loss: 1.2319 - classification_loss: 0.1871 312/500 [=================>............] - ETA: 47s - loss: 1.4196 - regression_loss: 1.2323 - classification_loss: 0.1873 313/500 [=================>............] - ETA: 47s - loss: 1.4181 - regression_loss: 1.2311 - classification_loss: 0.1871 314/500 [=================>............] - ETA: 47s - loss: 1.4187 - regression_loss: 1.2315 - classification_loss: 0.1872 315/500 [=================>............] - ETA: 46s - loss: 1.4190 - regression_loss: 1.2318 - classification_loss: 0.1872 316/500 [=================>............] - ETA: 46s - loss: 1.4169 - regression_loss: 1.2301 - classification_loss: 0.1868 317/500 [==================>...........] - ETA: 46s - loss: 1.4175 - regression_loss: 1.2308 - classification_loss: 0.1868 318/500 [==================>...........] 
- ETA: 46s - loss: 1.4163 - regression_loss: 1.2297 - classification_loss: 0.1866 319/500 [==================>...........] - ETA: 45s - loss: 1.4141 - regression_loss: 1.2279 - classification_loss: 0.1862 320/500 [==================>...........] - ETA: 45s - loss: 1.4135 - regression_loss: 1.2276 - classification_loss: 0.1858 321/500 [==================>...........] - ETA: 45s - loss: 1.4139 - regression_loss: 1.2280 - classification_loss: 0.1859 322/500 [==================>...........] - ETA: 45s - loss: 1.4114 - regression_loss: 1.2259 - classification_loss: 0.1855 323/500 [==================>...........] - ETA: 44s - loss: 1.4130 - regression_loss: 1.2272 - classification_loss: 0.1858 324/500 [==================>...........] - ETA: 44s - loss: 1.4109 - regression_loss: 1.2255 - classification_loss: 0.1854 325/500 [==================>...........] - ETA: 44s - loss: 1.4122 - regression_loss: 1.2267 - classification_loss: 0.1855 326/500 [==================>...........] - ETA: 44s - loss: 1.4128 - regression_loss: 1.2272 - classification_loss: 0.1856 327/500 [==================>...........] - ETA: 43s - loss: 1.4125 - regression_loss: 1.2271 - classification_loss: 0.1854 328/500 [==================>...........] - ETA: 43s - loss: 1.4130 - regression_loss: 1.2276 - classification_loss: 0.1854 329/500 [==================>...........] - ETA: 43s - loss: 1.4137 - regression_loss: 1.2283 - classification_loss: 0.1854 330/500 [==================>...........] - ETA: 43s - loss: 1.4133 - regression_loss: 1.2280 - classification_loss: 0.1854 331/500 [==================>...........] - ETA: 42s - loss: 1.4135 - regression_loss: 1.2281 - classification_loss: 0.1855 332/500 [==================>...........] - ETA: 42s - loss: 1.4144 - regression_loss: 1.2287 - classification_loss: 0.1856 333/500 [==================>...........] - ETA: 42s - loss: 1.4154 - regression_loss: 1.2296 - classification_loss: 0.1859 334/500 [===================>..........] 
- ETA: 42s - loss: 1.4171 - regression_loss: 1.2308 - classification_loss: 0.1863 335/500 [===================>..........] - ETA: 41s - loss: 1.4181 - regression_loss: 1.2319 - classification_loss: 0.1862 336/500 [===================>..........] - ETA: 41s - loss: 1.4191 - regression_loss: 1.2327 - classification_loss: 0.1864 337/500 [===================>..........] - ETA: 41s - loss: 1.4210 - regression_loss: 1.2340 - classification_loss: 0.1870 338/500 [===================>..........] - ETA: 41s - loss: 1.4200 - regression_loss: 1.2331 - classification_loss: 0.1869 339/500 [===================>..........] - ETA: 40s - loss: 1.4179 - regression_loss: 1.2311 - classification_loss: 0.1868 340/500 [===================>..........] - ETA: 40s - loss: 1.4177 - regression_loss: 1.2311 - classification_loss: 0.1866 341/500 [===================>..........] - ETA: 40s - loss: 1.4193 - regression_loss: 1.2325 - classification_loss: 0.1868 342/500 [===================>..........] - ETA: 40s - loss: 1.4197 - regression_loss: 1.2329 - classification_loss: 0.1868 343/500 [===================>..........] - ETA: 39s - loss: 1.4200 - regression_loss: 1.2332 - classification_loss: 0.1868 344/500 [===================>..........] - ETA: 39s - loss: 1.4189 - regression_loss: 1.2323 - classification_loss: 0.1866 345/500 [===================>..........] - ETA: 39s - loss: 1.4185 - regression_loss: 1.2318 - classification_loss: 0.1867 346/500 [===================>..........] - ETA: 39s - loss: 1.4195 - regression_loss: 1.2327 - classification_loss: 0.1868 347/500 [===================>..........] - ETA: 38s - loss: 1.4211 - regression_loss: 1.2340 - classification_loss: 0.1872 348/500 [===================>..........] - ETA: 38s - loss: 1.4199 - regression_loss: 1.2328 - classification_loss: 0.1871 349/500 [===================>..........] - ETA: 38s - loss: 1.4200 - regression_loss: 1.2330 - classification_loss: 0.1870 350/500 [====================>.........] 
[steps 351–494 of epoch 17 omitted: per-step progress-bar updates; the running loss oscillated between ≈1.40 and ≈1.42 (regression ≈1.22, classification ≈0.186)]
494/500 [============================>.]
[steps 495–499 of epoch 17 omitted: running loss ≈1.409–1.410]
500/500 [==============================] - 127s 254ms/step - loss: 1.4091 - regression_loss: 1.2222 - classification_loss: 0.1869
1172 instances of class plum with average precision: 0.7555
mAP: 0.7555
Epoch 00017: saving model to ./training/snapshots/resnet50_pascal_17.h5
Epoch 18/150
[steps 1–8 of epoch 18 omitted: running loss fell from ≈1.85 to ≈1.51]
9/500 [..............................]
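Flattened progress lines like the ones in this log can be parsed back into per-step loss curves for plotting or comparison across epochs. A minimal sketch; the regex is an assumption inferred from the log format shown here, not part of any keras-retinanet tooling:

```python
import re

# Each progress update in the log has the shape:
#   "351/500 [====>...] - ETA: 37s - loss: 1.4215 - regression_loss: 1.2342 - classification_loss: 0.1873"
# The final step replaces the ETA with a wall-clock time, but the loss fields are identical.
PATTERN = re.compile(
    r"(\d+)/\d+ \[[=>.]+\].*?"
    r"loss: ([\d.]+) - regression_loss: ([\d.]+) - classification_loss: ([\d.]+)"
)

def parse_progress(log_text):
    """Return a list of (step, loss, regression_loss, classification_loss) tuples."""
    return [
        (int(step), float(total), float(reg), float(cls))
        for step, total, reg, cls in (m.groups() for m in PATTERN.finditer(log_text))
    ]

sample = ("500/500 [==============================] - 127s 254ms/step"
          " - loss: 1.4091 - regression_loss: 1.2222 - classification_loss: 0.1869")
print(parse_progress(sample))  # [(500, 1.4091, 1.2222, 0.1869)]
```

As the parsed values confirm, the reported `loss` is simply the sum of the regression and classification terms (1.2222 + 0.1869 = 1.4091 at the end of epoch 17).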
[steps 9–185 of epoch 18 omitted: running loss dropped from ≈1.52 to ≈1.27 by step 49, then climbed back to ≈1.38 by step 185]
185/500 [==========>...................]
- ETA: 1:18 - loss: 1.3757 - regression_loss: 1.2008 - classification_loss: 0.1748 186/500 [==========>...................] - ETA: 1:18 - loss: 1.3784 - regression_loss: 1.2031 - classification_loss: 0.1753 187/500 [==========>...................] - ETA: 1:18 - loss: 1.3790 - regression_loss: 1.2034 - classification_loss: 0.1756 188/500 [==========>...................] - ETA: 1:18 - loss: 1.3795 - regression_loss: 1.2037 - classification_loss: 0.1758 189/500 [==========>...................] - ETA: 1:17 - loss: 1.3809 - regression_loss: 1.2049 - classification_loss: 0.1759 190/500 [==========>...................] - ETA: 1:17 - loss: 1.3844 - regression_loss: 1.2077 - classification_loss: 0.1767 191/500 [==========>...................] - ETA: 1:17 - loss: 1.3851 - regression_loss: 1.2083 - classification_loss: 0.1768 192/500 [==========>...................] - ETA: 1:17 - loss: 1.3871 - regression_loss: 1.2099 - classification_loss: 0.1772 193/500 [==========>...................] - ETA: 1:16 - loss: 1.3872 - regression_loss: 1.2099 - classification_loss: 0.1773 194/500 [==========>...................] - ETA: 1:16 - loss: 1.3892 - regression_loss: 1.2117 - classification_loss: 0.1775 195/500 [==========>...................] - ETA: 1:16 - loss: 1.3881 - regression_loss: 1.2109 - classification_loss: 0.1773 196/500 [==========>...................] - ETA: 1:16 - loss: 1.3882 - regression_loss: 1.2107 - classification_loss: 0.1775 197/500 [==========>...................] - ETA: 1:15 - loss: 1.3846 - regression_loss: 1.2076 - classification_loss: 0.1770 198/500 [==========>...................] - ETA: 1:15 - loss: 1.3873 - regression_loss: 1.2095 - classification_loss: 0.1777 199/500 [==========>...................] - ETA: 1:15 - loss: 1.3867 - regression_loss: 1.2091 - classification_loss: 0.1776 200/500 [===========>..................] - ETA: 1:15 - loss: 1.3886 - regression_loss: 1.2107 - classification_loss: 0.1780 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.3884 - regression_loss: 1.2107 - classification_loss: 0.1777 202/500 [===========>..................] - ETA: 1:14 - loss: 1.3856 - regression_loss: 1.2082 - classification_loss: 0.1773 203/500 [===========>..................] - ETA: 1:14 - loss: 1.3812 - regression_loss: 1.2045 - classification_loss: 0.1768 204/500 [===========>..................] - ETA: 1:14 - loss: 1.3809 - regression_loss: 1.2037 - classification_loss: 0.1772 205/500 [===========>..................] - ETA: 1:13 - loss: 1.3784 - regression_loss: 1.2017 - classification_loss: 0.1768 206/500 [===========>..................] - ETA: 1:13 - loss: 1.3790 - regression_loss: 1.2023 - classification_loss: 0.1767 207/500 [===========>..................] - ETA: 1:13 - loss: 1.3793 - regression_loss: 1.2026 - classification_loss: 0.1767 208/500 [===========>..................] - ETA: 1:13 - loss: 1.3790 - regression_loss: 1.2024 - classification_loss: 0.1766 209/500 [===========>..................] - ETA: 1:12 - loss: 1.3815 - regression_loss: 1.2049 - classification_loss: 0.1765 210/500 [===========>..................] - ETA: 1:12 - loss: 1.3816 - regression_loss: 1.2053 - classification_loss: 0.1763 211/500 [===========>..................] - ETA: 1:12 - loss: 1.3816 - regression_loss: 1.2052 - classification_loss: 0.1763 212/500 [===========>..................] - ETA: 1:12 - loss: 1.3854 - regression_loss: 1.2082 - classification_loss: 0.1771 213/500 [===========>..................] - ETA: 1:11 - loss: 1.3826 - regression_loss: 1.2060 - classification_loss: 0.1767 214/500 [===========>..................] - ETA: 1:11 - loss: 1.3871 - regression_loss: 1.2103 - classification_loss: 0.1767 215/500 [===========>..................] - ETA: 1:11 - loss: 1.3873 - regression_loss: 1.2105 - classification_loss: 0.1767 216/500 [===========>..................] - ETA: 1:11 - loss: 1.3902 - regression_loss: 1.2133 - classification_loss: 0.1769 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.3895 - regression_loss: 1.2126 - classification_loss: 0.1769 218/500 [============>.................] - ETA: 1:10 - loss: 1.3865 - regression_loss: 1.2102 - classification_loss: 0.1764 219/500 [============>.................] - ETA: 1:10 - loss: 1.3828 - regression_loss: 1.2069 - classification_loss: 0.1759 220/500 [============>.................] - ETA: 1:10 - loss: 1.3812 - regression_loss: 1.2057 - classification_loss: 0.1755 221/500 [============>.................] - ETA: 1:09 - loss: 1.3806 - regression_loss: 1.2052 - classification_loss: 0.1754 222/500 [============>.................] - ETA: 1:09 - loss: 1.3794 - regression_loss: 1.2040 - classification_loss: 0.1754 223/500 [============>.................] - ETA: 1:09 - loss: 1.3804 - regression_loss: 1.2049 - classification_loss: 0.1755 224/500 [============>.................] - ETA: 1:09 - loss: 1.3843 - regression_loss: 1.2084 - classification_loss: 0.1759 225/500 [============>.................] - ETA: 1:08 - loss: 1.3827 - regression_loss: 1.2068 - classification_loss: 0.1759 226/500 [============>.................] - ETA: 1:08 - loss: 1.3822 - regression_loss: 1.2061 - classification_loss: 0.1760 227/500 [============>.................] - ETA: 1:08 - loss: 1.3809 - regression_loss: 1.2051 - classification_loss: 0.1758 228/500 [============>.................] - ETA: 1:08 - loss: 1.3807 - regression_loss: 1.2050 - classification_loss: 0.1757 229/500 [============>.................] - ETA: 1:07 - loss: 1.3806 - regression_loss: 1.2050 - classification_loss: 0.1756 230/500 [============>.................] - ETA: 1:07 - loss: 1.3776 - regression_loss: 1.2023 - classification_loss: 0.1753 231/500 [============>.................] - ETA: 1:07 - loss: 1.3745 - regression_loss: 1.1996 - classification_loss: 0.1749 232/500 [============>.................] - ETA: 1:07 - loss: 1.3744 - regression_loss: 1.1995 - classification_loss: 0.1749 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.3751 - regression_loss: 1.2001 - classification_loss: 0.1750 234/500 [=============>................] - ETA: 1:06 - loss: 1.3763 - regression_loss: 1.2011 - classification_loss: 0.1752 235/500 [=============>................] - ETA: 1:06 - loss: 1.3758 - regression_loss: 1.2007 - classification_loss: 0.1750 236/500 [=============>................] - ETA: 1:06 - loss: 1.3771 - regression_loss: 1.2017 - classification_loss: 0.1754 237/500 [=============>................] - ETA: 1:05 - loss: 1.3783 - regression_loss: 1.2028 - classification_loss: 0.1755 238/500 [=============>................] - ETA: 1:05 - loss: 1.3794 - regression_loss: 1.2037 - classification_loss: 0.1757 239/500 [=============>................] - ETA: 1:05 - loss: 1.3793 - regression_loss: 1.2034 - classification_loss: 0.1758 240/500 [=============>................] - ETA: 1:05 - loss: 1.3771 - regression_loss: 1.2015 - classification_loss: 0.1755 241/500 [=============>................] - ETA: 1:05 - loss: 1.3763 - regression_loss: 1.2008 - classification_loss: 0.1755 242/500 [=============>................] - ETA: 1:04 - loss: 1.3772 - regression_loss: 1.2015 - classification_loss: 0.1757 243/500 [=============>................] - ETA: 1:04 - loss: 1.3778 - regression_loss: 1.2021 - classification_loss: 0.1757 244/500 [=============>................] - ETA: 1:04 - loss: 1.3760 - regression_loss: 1.2005 - classification_loss: 0.1756 245/500 [=============>................] - ETA: 1:03 - loss: 1.3752 - regression_loss: 1.1997 - classification_loss: 0.1755 246/500 [=============>................] - ETA: 1:03 - loss: 1.3754 - regression_loss: 1.2001 - classification_loss: 0.1753 247/500 [=============>................] - ETA: 1:03 - loss: 1.3737 - regression_loss: 1.1986 - classification_loss: 0.1750 248/500 [=============>................] - ETA: 1:03 - loss: 1.3715 - regression_loss: 1.1968 - classification_loss: 0.1747 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.3736 - regression_loss: 1.1987 - classification_loss: 0.1749 250/500 [==============>...............] - ETA: 1:02 - loss: 1.3731 - regression_loss: 1.1982 - classification_loss: 0.1749 251/500 [==============>...............] - ETA: 1:02 - loss: 1.3715 - regression_loss: 1.1968 - classification_loss: 0.1746 252/500 [==============>...............] - ETA: 1:02 - loss: 1.3690 - regression_loss: 1.1948 - classification_loss: 0.1742 253/500 [==============>...............] - ETA: 1:02 - loss: 1.3692 - regression_loss: 1.1949 - classification_loss: 0.1743 254/500 [==============>...............] - ETA: 1:01 - loss: 1.3696 - regression_loss: 1.1950 - classification_loss: 0.1746 255/500 [==============>...............] - ETA: 1:01 - loss: 1.3685 - regression_loss: 1.1941 - classification_loss: 0.1745 256/500 [==============>...............] - ETA: 1:01 - loss: 1.3669 - regression_loss: 1.1926 - classification_loss: 0.1743 257/500 [==============>...............] - ETA: 1:00 - loss: 1.3690 - regression_loss: 1.1943 - classification_loss: 0.1748 258/500 [==============>...............] - ETA: 1:00 - loss: 1.3699 - regression_loss: 1.1951 - classification_loss: 0.1749 259/500 [==============>...............] - ETA: 1:00 - loss: 1.3701 - regression_loss: 1.1954 - classification_loss: 0.1748 260/500 [==============>...............] - ETA: 1:00 - loss: 1.3710 - regression_loss: 1.1963 - classification_loss: 0.1747 261/500 [==============>...............] - ETA: 1:00 - loss: 1.3721 - regression_loss: 1.1972 - classification_loss: 0.1749 262/500 [==============>...............] - ETA: 59s - loss: 1.3720 - regression_loss: 1.1972 - classification_loss: 0.1748  263/500 [==============>...............] - ETA: 59s - loss: 1.3722 - regression_loss: 1.1975 - classification_loss: 0.1747 264/500 [==============>...............] - ETA: 59s - loss: 1.3709 - regression_loss: 1.1963 - classification_loss: 0.1746 265/500 [==============>...............] 
- ETA: 58s - loss: 1.3713 - regression_loss: 1.1967 - classification_loss: 0.1746 266/500 [==============>...............] - ETA: 58s - loss: 1.3689 - regression_loss: 1.1946 - classification_loss: 0.1743 267/500 [===============>..............] - ETA: 58s - loss: 1.3693 - regression_loss: 1.1951 - classification_loss: 0.1742 268/500 [===============>..............] - ETA: 58s - loss: 1.3691 - regression_loss: 1.1949 - classification_loss: 0.1742 269/500 [===============>..............] - ETA: 57s - loss: 1.3664 - regression_loss: 1.1925 - classification_loss: 0.1739 270/500 [===============>..............] - ETA: 57s - loss: 1.3667 - regression_loss: 1.1927 - classification_loss: 0.1740 271/500 [===============>..............] - ETA: 57s - loss: 1.3657 - regression_loss: 1.1919 - classification_loss: 0.1738 272/500 [===============>..............] - ETA: 57s - loss: 1.3686 - regression_loss: 1.1945 - classification_loss: 0.1741 273/500 [===============>..............] - ETA: 56s - loss: 1.3691 - regression_loss: 1.1950 - classification_loss: 0.1740 274/500 [===============>..............] - ETA: 56s - loss: 1.3662 - regression_loss: 1.1924 - classification_loss: 0.1738 275/500 [===============>..............] - ETA: 56s - loss: 1.3675 - regression_loss: 1.1935 - classification_loss: 0.1740 276/500 [===============>..............] - ETA: 56s - loss: 1.3697 - regression_loss: 1.1953 - classification_loss: 0.1744 277/500 [===============>..............] - ETA: 55s - loss: 1.3697 - regression_loss: 1.1952 - classification_loss: 0.1744 278/500 [===============>..............] - ETA: 55s - loss: 1.3704 - regression_loss: 1.1959 - classification_loss: 0.1744 279/500 [===============>..............] - ETA: 55s - loss: 1.3708 - regression_loss: 1.1964 - classification_loss: 0.1745 280/500 [===============>..............] - ETA: 55s - loss: 1.3721 - regression_loss: 1.1975 - classification_loss: 0.1746 281/500 [===============>..............] 
- ETA: 54s - loss: 1.3752 - regression_loss: 1.2003 - classification_loss: 0.1749 282/500 [===============>..............] - ETA: 54s - loss: 1.3753 - regression_loss: 1.2004 - classification_loss: 0.1749 283/500 [===============>..............] - ETA: 54s - loss: 1.3724 - regression_loss: 1.1979 - classification_loss: 0.1745 284/500 [================>.............] - ETA: 54s - loss: 1.3734 - regression_loss: 1.1988 - classification_loss: 0.1746 285/500 [================>.............] - ETA: 53s - loss: 1.3741 - regression_loss: 1.1994 - classification_loss: 0.1747 286/500 [================>.............] - ETA: 53s - loss: 1.3748 - regression_loss: 1.2001 - classification_loss: 0.1747 287/500 [================>.............] - ETA: 53s - loss: 1.3747 - regression_loss: 1.2001 - classification_loss: 0.1747 288/500 [================>.............] - ETA: 53s - loss: 1.3712 - regression_loss: 1.1969 - classification_loss: 0.1742 289/500 [================>.............] - ETA: 52s - loss: 1.3737 - regression_loss: 1.1990 - classification_loss: 0.1747 290/500 [================>.............] - ETA: 52s - loss: 1.3748 - regression_loss: 1.2000 - classification_loss: 0.1748 291/500 [================>.............] - ETA: 52s - loss: 1.3719 - regression_loss: 1.1975 - classification_loss: 0.1744 292/500 [================>.............] - ETA: 52s - loss: 1.3700 - regression_loss: 1.1960 - classification_loss: 0.1740 293/500 [================>.............] - ETA: 51s - loss: 1.3699 - regression_loss: 1.1960 - classification_loss: 0.1740 294/500 [================>.............] - ETA: 51s - loss: 1.3682 - regression_loss: 1.1946 - classification_loss: 0.1736 295/500 [================>.............] - ETA: 51s - loss: 1.3693 - regression_loss: 1.1955 - classification_loss: 0.1738 296/500 [================>.............] - ETA: 51s - loss: 1.3670 - regression_loss: 1.1935 - classification_loss: 0.1735 297/500 [================>.............] 
- ETA: 50s - loss: 1.3669 - regression_loss: 1.1937 - classification_loss: 0.1732 298/500 [================>.............] - ETA: 50s - loss: 1.3653 - regression_loss: 1.1923 - classification_loss: 0.1730 299/500 [================>.............] - ETA: 50s - loss: 1.3655 - regression_loss: 1.1925 - classification_loss: 0.1730 300/500 [=================>............] - ETA: 50s - loss: 1.3656 - regression_loss: 1.1926 - classification_loss: 0.1730 301/500 [=================>............] - ETA: 49s - loss: 1.3653 - regression_loss: 1.1924 - classification_loss: 0.1729 302/500 [=================>............] - ETA: 49s - loss: 1.3657 - regression_loss: 1.1928 - classification_loss: 0.1729 303/500 [=================>............] - ETA: 49s - loss: 1.3653 - regression_loss: 1.1926 - classification_loss: 0.1727 304/500 [=================>............] - ETA: 49s - loss: 1.3667 - regression_loss: 1.1937 - classification_loss: 0.1730 305/500 [=================>............] - ETA: 48s - loss: 1.3672 - regression_loss: 1.1942 - classification_loss: 0.1730 306/500 [=================>............] - ETA: 48s - loss: 1.3648 - regression_loss: 1.1920 - classification_loss: 0.1727 307/500 [=================>............] - ETA: 48s - loss: 1.3658 - regression_loss: 1.1929 - classification_loss: 0.1729 308/500 [=================>............] - ETA: 48s - loss: 1.3659 - regression_loss: 1.1929 - classification_loss: 0.1729 309/500 [=================>............] - ETA: 47s - loss: 1.3667 - regression_loss: 1.1936 - classification_loss: 0.1731 310/500 [=================>............] - ETA: 47s - loss: 1.3646 - regression_loss: 1.1918 - classification_loss: 0.1728 311/500 [=================>............] - ETA: 47s - loss: 1.3674 - regression_loss: 1.1942 - classification_loss: 0.1732 312/500 [=================>............] - ETA: 47s - loss: 1.3668 - regression_loss: 1.1939 - classification_loss: 0.1729 313/500 [=================>............] 
- ETA: 46s - loss: 1.3670 - regression_loss: 1.1942 - classification_loss: 0.1729 314/500 [=================>............] - ETA: 46s - loss: 1.3665 - regression_loss: 1.1937 - classification_loss: 0.1728 315/500 [=================>............] - ETA: 46s - loss: 1.3653 - regression_loss: 1.1926 - classification_loss: 0.1728 316/500 [=================>............] - ETA: 46s - loss: 1.3634 - regression_loss: 1.1909 - classification_loss: 0.1725 317/500 [==================>...........] - ETA: 45s - loss: 1.3637 - regression_loss: 1.1911 - classification_loss: 0.1726 318/500 [==================>...........] - ETA: 45s - loss: 1.3652 - regression_loss: 1.1922 - classification_loss: 0.1730 319/500 [==================>...........] - ETA: 45s - loss: 1.3671 - regression_loss: 1.1939 - classification_loss: 0.1733 320/500 [==================>...........] - ETA: 45s - loss: 1.3669 - regression_loss: 1.1938 - classification_loss: 0.1732 321/500 [==================>...........] - ETA: 44s - loss: 1.3670 - regression_loss: 1.1941 - classification_loss: 0.1729 322/500 [==================>...........] - ETA: 44s - loss: 1.3683 - regression_loss: 1.1955 - classification_loss: 0.1728 323/500 [==================>...........] - ETA: 44s - loss: 1.3694 - regression_loss: 1.1965 - classification_loss: 0.1729 324/500 [==================>...........] - ETA: 44s - loss: 1.3700 - regression_loss: 1.1970 - classification_loss: 0.1729 325/500 [==================>...........] - ETA: 43s - loss: 1.3702 - regression_loss: 1.1971 - classification_loss: 0.1731 326/500 [==================>...........] - ETA: 43s - loss: 1.3722 - regression_loss: 1.1985 - classification_loss: 0.1737 327/500 [==================>...........] - ETA: 43s - loss: 1.3720 - regression_loss: 1.1984 - classification_loss: 0.1736 328/500 [==================>...........] - ETA: 43s - loss: 1.3724 - regression_loss: 1.1986 - classification_loss: 0.1737 329/500 [==================>...........] 
- ETA: 42s - loss: 1.3700 - regression_loss: 1.1966 - classification_loss: 0.1734 330/500 [==================>...........] - ETA: 42s - loss: 1.3693 - regression_loss: 1.1961 - classification_loss: 0.1732 331/500 [==================>...........] - ETA: 42s - loss: 1.3680 - regression_loss: 1.1949 - classification_loss: 0.1731 332/500 [==================>...........] - ETA: 42s - loss: 1.3674 - regression_loss: 1.1945 - classification_loss: 0.1729 333/500 [==================>...........] - ETA: 41s - loss: 1.3673 - regression_loss: 1.1945 - classification_loss: 0.1729 334/500 [===================>..........] - ETA: 41s - loss: 1.3661 - regression_loss: 1.1934 - classification_loss: 0.1728 335/500 [===================>..........] - ETA: 41s - loss: 1.3663 - regression_loss: 1.1936 - classification_loss: 0.1727 336/500 [===================>..........] - ETA: 41s - loss: 1.3661 - regression_loss: 1.1934 - classification_loss: 0.1727 337/500 [===================>..........] - ETA: 40s - loss: 1.3667 - regression_loss: 1.1940 - classification_loss: 0.1728 338/500 [===================>..........] - ETA: 40s - loss: 1.3673 - regression_loss: 1.1944 - classification_loss: 0.1728 339/500 [===================>..........] - ETA: 40s - loss: 1.3687 - regression_loss: 1.1956 - classification_loss: 0.1731 340/500 [===================>..........] - ETA: 40s - loss: 1.3692 - regression_loss: 1.1962 - classification_loss: 0.1731 341/500 [===================>..........] - ETA: 39s - loss: 1.3702 - regression_loss: 1.1969 - classification_loss: 0.1733 342/500 [===================>..........] - ETA: 39s - loss: 1.3702 - regression_loss: 1.1968 - classification_loss: 0.1734 343/500 [===================>..........] - ETA: 39s - loss: 1.3697 - regression_loss: 1.1965 - classification_loss: 0.1732 344/500 [===================>..........] - ETA: 39s - loss: 1.3694 - regression_loss: 1.1962 - classification_loss: 0.1732 345/500 [===================>..........] 
- ETA: 38s - loss: 1.3702 - regression_loss: 1.1969 - classification_loss: 0.1733 346/500 [===================>..........] - ETA: 38s - loss: 1.3699 - regression_loss: 1.1967 - classification_loss: 0.1732 347/500 [===================>..........] - ETA: 38s - loss: 1.3680 - regression_loss: 1.1951 - classification_loss: 0.1728 348/500 [===================>..........] - ETA: 38s - loss: 1.3691 - regression_loss: 1.1962 - classification_loss: 0.1729 349/500 [===================>..........] - ETA: 37s - loss: 1.3678 - regression_loss: 1.1950 - classification_loss: 0.1729 350/500 [====================>.........] - ETA: 37s - loss: 1.3665 - regression_loss: 1.1937 - classification_loss: 0.1727 351/500 [====================>.........] - ETA: 37s - loss: 1.3683 - regression_loss: 1.1956 - classification_loss: 0.1728 352/500 [====================>.........] - ETA: 37s - loss: 1.3681 - regression_loss: 1.1955 - classification_loss: 0.1727 353/500 [====================>.........] - ETA: 36s - loss: 1.3690 - regression_loss: 1.1961 - classification_loss: 0.1730 354/500 [====================>.........] - ETA: 36s - loss: 1.3682 - regression_loss: 1.1954 - classification_loss: 0.1729 355/500 [====================>.........] - ETA: 36s - loss: 1.3675 - regression_loss: 1.1948 - classification_loss: 0.1727 356/500 [====================>.........] - ETA: 36s - loss: 1.3686 - regression_loss: 1.1957 - classification_loss: 0.1729 357/500 [====================>.........] - ETA: 35s - loss: 1.3682 - regression_loss: 1.1954 - classification_loss: 0.1728 358/500 [====================>.........] - ETA: 35s - loss: 1.3685 - regression_loss: 1.1958 - classification_loss: 0.1727 359/500 [====================>.........] - ETA: 35s - loss: 1.3683 - regression_loss: 1.1956 - classification_loss: 0.1727 360/500 [====================>.........] - ETA: 35s - loss: 1.3662 - regression_loss: 1.1938 - classification_loss: 0.1724 361/500 [====================>.........] 
- ETA: 34s - loss: 1.3664 - regression_loss: 1.1940 - classification_loss: 0.1724 362/500 [====================>.........] - ETA: 34s - loss: 1.3668 - regression_loss: 1.1943 - classification_loss: 0.1725 363/500 [====================>.........] - ETA: 34s - loss: 1.3677 - regression_loss: 1.1951 - classification_loss: 0.1726 364/500 [====================>.........] - ETA: 34s - loss: 1.3694 - regression_loss: 1.1963 - classification_loss: 0.1731 365/500 [====================>.........] - ETA: 33s - loss: 1.3698 - regression_loss: 1.1963 - classification_loss: 0.1735 366/500 [====================>.........] - ETA: 33s - loss: 1.3699 - regression_loss: 1.1964 - classification_loss: 0.1735 367/500 [=====================>........] - ETA: 33s - loss: 1.3701 - regression_loss: 1.1967 - classification_loss: 0.1734 368/500 [=====================>........] - ETA: 33s - loss: 1.3703 - regression_loss: 1.1968 - classification_loss: 0.1734 369/500 [=====================>........] - ETA: 32s - loss: 1.3705 - regression_loss: 1.1970 - classification_loss: 0.1735 370/500 [=====================>........] - ETA: 32s - loss: 1.3706 - regression_loss: 1.1972 - classification_loss: 0.1734 371/500 [=====================>........] - ETA: 32s - loss: 1.3690 - regression_loss: 1.1959 - classification_loss: 0.1731 372/500 [=====================>........] - ETA: 32s - loss: 1.3672 - regression_loss: 1.1943 - classification_loss: 0.1729 373/500 [=====================>........] - ETA: 31s - loss: 1.3689 - regression_loss: 1.1956 - classification_loss: 0.1733 374/500 [=====================>........] - ETA: 31s - loss: 1.3690 - regression_loss: 1.1957 - classification_loss: 0.1733 375/500 [=====================>........] - ETA: 31s - loss: 1.3709 - regression_loss: 1.1972 - classification_loss: 0.1736 376/500 [=====================>........] - ETA: 31s - loss: 1.3714 - regression_loss: 1.1976 - classification_loss: 0.1738 377/500 [=====================>........] 
- ETA: 30s - loss: 1.3725 - regression_loss: 1.1986 - classification_loss: 0.1739 378/500 [=====================>........] - ETA: 30s - loss: 1.3727 - regression_loss: 1.1988 - classification_loss: 0.1739 379/500 [=====================>........] - ETA: 30s - loss: 1.3724 - regression_loss: 1.1986 - classification_loss: 0.1738 380/500 [=====================>........] - ETA: 30s - loss: 1.3726 - regression_loss: 1.1988 - classification_loss: 0.1738 381/500 [=====================>........] - ETA: 29s - loss: 1.3734 - regression_loss: 1.1995 - classification_loss: 0.1740 382/500 [=====================>........] - ETA: 29s - loss: 1.3718 - regression_loss: 1.1979 - classification_loss: 0.1739 383/500 [=====================>........] - ETA: 29s - loss: 1.3714 - regression_loss: 1.1975 - classification_loss: 0.1739 384/500 [======================>.......] - ETA: 29s - loss: 1.3704 - regression_loss: 1.1967 - classification_loss: 0.1737 385/500 [======================>.......] - ETA: 28s - loss: 1.3698 - regression_loss: 1.1962 - classification_loss: 0.1736 386/500 [======================>.......] - ETA: 28s - loss: 1.3707 - regression_loss: 1.1969 - classification_loss: 0.1737 387/500 [======================>.......] - ETA: 28s - loss: 1.3700 - regression_loss: 1.1961 - classification_loss: 0.1740 388/500 [======================>.......] - ETA: 28s - loss: 1.3706 - regression_loss: 1.1965 - classification_loss: 0.1741 389/500 [======================>.......] - ETA: 27s - loss: 1.3707 - regression_loss: 1.1966 - classification_loss: 0.1741 390/500 [======================>.......] - ETA: 27s - loss: 1.3711 - regression_loss: 1.1968 - classification_loss: 0.1743 391/500 [======================>.......] - ETA: 27s - loss: 1.3720 - regression_loss: 1.1978 - classification_loss: 0.1743 392/500 [======================>.......] - ETA: 27s - loss: 1.3725 - regression_loss: 1.1981 - classification_loss: 0.1744 393/500 [======================>.......] 
[... per-batch progress output for epoch 18 elided ...]
500/500 [==============================] - 125s 251ms/step - loss: 1.3715 - regression_loss: 1.1934 - classification_loss: 0.1781
1172 instances of class plum with average precision: 0.7274
mAP: 0.7274
Epoch 00018: saving model to ./training/snapshots/resnet50_pascal_18.h5
Epoch 19/150
[... per-batch progress output for epoch 19 elided; log truncated at step 228/500 ...]
- ETA: 1:08 - loss: 1.3852 - regression_loss: 1.2072 - classification_loss: 0.1780 229/500 [============>.................] - ETA: 1:07 - loss: 1.3840 - regression_loss: 1.2064 - classification_loss: 0.1776 230/500 [============>.................] - ETA: 1:07 - loss: 1.3835 - regression_loss: 1.2061 - classification_loss: 0.1774 231/500 [============>.................] - ETA: 1:07 - loss: 1.3850 - regression_loss: 1.2073 - classification_loss: 0.1777 232/500 [============>.................] - ETA: 1:07 - loss: 1.3851 - regression_loss: 1.2076 - classification_loss: 0.1775 233/500 [============>.................] - ETA: 1:06 - loss: 1.3830 - regression_loss: 1.2058 - classification_loss: 0.1772 234/500 [=============>................] - ETA: 1:06 - loss: 1.3832 - regression_loss: 1.2061 - classification_loss: 0.1771 235/500 [=============>................] - ETA: 1:06 - loss: 1.3846 - regression_loss: 1.2073 - classification_loss: 0.1773 236/500 [=============>................] - ETA: 1:06 - loss: 1.3848 - regression_loss: 1.2074 - classification_loss: 0.1774 237/500 [=============>................] - ETA: 1:05 - loss: 1.3850 - regression_loss: 1.2077 - classification_loss: 0.1773 238/500 [=============>................] - ETA: 1:05 - loss: 1.3845 - regression_loss: 1.2075 - classification_loss: 0.1771 239/500 [=============>................] - ETA: 1:05 - loss: 1.3851 - regression_loss: 1.2079 - classification_loss: 0.1772 240/500 [=============>................] - ETA: 1:05 - loss: 1.3845 - regression_loss: 1.2073 - classification_loss: 0.1771 241/500 [=============>................] - ETA: 1:04 - loss: 1.3879 - regression_loss: 1.2104 - classification_loss: 0.1776 242/500 [=============>................] - ETA: 1:04 - loss: 1.3900 - regression_loss: 1.2121 - classification_loss: 0.1779 243/500 [=============>................] - ETA: 1:04 - loss: 1.3900 - regression_loss: 1.2120 - classification_loss: 0.1780 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.3875 - regression_loss: 1.2098 - classification_loss: 0.1777 245/500 [=============>................] - ETA: 1:03 - loss: 1.3891 - regression_loss: 1.2111 - classification_loss: 0.1780 246/500 [=============>................] - ETA: 1:03 - loss: 1.3898 - regression_loss: 1.2118 - classification_loss: 0.1780 247/500 [=============>................] - ETA: 1:03 - loss: 1.3905 - regression_loss: 1.2123 - classification_loss: 0.1781 248/500 [=============>................] - ETA: 1:03 - loss: 1.3907 - regression_loss: 1.2125 - classification_loss: 0.1782 249/500 [=============>................] - ETA: 1:02 - loss: 1.3901 - regression_loss: 1.2120 - classification_loss: 0.1781 250/500 [==============>...............] - ETA: 1:02 - loss: 1.3875 - regression_loss: 1.2098 - classification_loss: 0.1777 251/500 [==============>...............] - ETA: 1:02 - loss: 1.3869 - regression_loss: 1.2094 - classification_loss: 0.1775 252/500 [==============>...............] - ETA: 1:02 - loss: 1.3853 - regression_loss: 1.2082 - classification_loss: 0.1771 253/500 [==============>...............] - ETA: 1:01 - loss: 1.3851 - regression_loss: 1.2077 - classification_loss: 0.1774 254/500 [==============>...............] - ETA: 1:01 - loss: 1.3866 - regression_loss: 1.2089 - classification_loss: 0.1777 255/500 [==============>...............] - ETA: 1:01 - loss: 1.3865 - regression_loss: 1.2088 - classification_loss: 0.1777 256/500 [==============>...............] - ETA: 1:01 - loss: 1.3854 - regression_loss: 1.2081 - classification_loss: 0.1774 257/500 [==============>...............] - ETA: 1:00 - loss: 1.3853 - regression_loss: 1.2080 - classification_loss: 0.1773 258/500 [==============>...............] - ETA: 1:00 - loss: 1.3871 - regression_loss: 1.2099 - classification_loss: 0.1772 259/500 [==============>...............] - ETA: 1:00 - loss: 1.3835 - regression_loss: 1.2069 - classification_loss: 0.1766 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.3821 - regression_loss: 1.2056 - classification_loss: 0.1765 261/500 [==============>...............] - ETA: 1:00 - loss: 1.3834 - regression_loss: 1.2067 - classification_loss: 0.1767 262/500 [==============>...............] - ETA: 59s - loss: 1.3806 - regression_loss: 1.2042 - classification_loss: 0.1764  263/500 [==============>...............] - ETA: 59s - loss: 1.3813 - regression_loss: 1.2049 - classification_loss: 0.1764 264/500 [==============>...............] - ETA: 59s - loss: 1.3798 - regression_loss: 1.2035 - classification_loss: 0.1763 265/500 [==============>...............] - ETA: 59s - loss: 1.3781 - regression_loss: 1.2021 - classification_loss: 0.1760 266/500 [==============>...............] - ETA: 58s - loss: 1.3797 - regression_loss: 1.2034 - classification_loss: 0.1763 267/500 [===============>..............] - ETA: 58s - loss: 1.3769 - regression_loss: 1.2008 - classification_loss: 0.1761 268/500 [===============>..............] - ETA: 58s - loss: 1.3740 - regression_loss: 1.1982 - classification_loss: 0.1758 269/500 [===============>..............] - ETA: 58s - loss: 1.3735 - regression_loss: 1.1977 - classification_loss: 0.1758 270/500 [===============>..............] - ETA: 57s - loss: 1.3742 - regression_loss: 1.1982 - classification_loss: 0.1760 271/500 [===============>..............] - ETA: 57s - loss: 1.3732 - regression_loss: 1.1974 - classification_loss: 0.1759 272/500 [===============>..............] - ETA: 57s - loss: 1.3739 - regression_loss: 1.1981 - classification_loss: 0.1757 273/500 [===============>..............] - ETA: 57s - loss: 1.3701 - regression_loss: 1.1948 - classification_loss: 0.1753 274/500 [===============>..............] - ETA: 56s - loss: 1.3698 - regression_loss: 1.1947 - classification_loss: 0.1751 275/500 [===============>..............] - ETA: 56s - loss: 1.3672 - regression_loss: 1.1923 - classification_loss: 0.1748 276/500 [===============>..............] 
- ETA: 56s - loss: 1.3671 - regression_loss: 1.1923 - classification_loss: 0.1748 277/500 [===============>..............] - ETA: 56s - loss: 1.3687 - regression_loss: 1.1936 - classification_loss: 0.1751 278/500 [===============>..............] - ETA: 55s - loss: 1.3675 - regression_loss: 1.1927 - classification_loss: 0.1748 279/500 [===============>..............] - ETA: 55s - loss: 1.3678 - regression_loss: 1.1929 - classification_loss: 0.1749 280/500 [===============>..............] - ETA: 55s - loss: 1.3675 - regression_loss: 1.1927 - classification_loss: 0.1748 281/500 [===============>..............] - ETA: 55s - loss: 1.3680 - regression_loss: 1.1932 - classification_loss: 0.1748 282/500 [===============>..............] - ETA: 54s - loss: 1.3676 - regression_loss: 1.1926 - classification_loss: 0.1750 283/500 [===============>..............] - ETA: 54s - loss: 1.3661 - regression_loss: 1.1914 - classification_loss: 0.1747 284/500 [================>.............] - ETA: 54s - loss: 1.3676 - regression_loss: 1.1927 - classification_loss: 0.1749 285/500 [================>.............] - ETA: 54s - loss: 1.3673 - regression_loss: 1.1925 - classification_loss: 0.1748 286/500 [================>.............] - ETA: 53s - loss: 1.3681 - regression_loss: 1.1934 - classification_loss: 0.1747 287/500 [================>.............] - ETA: 53s - loss: 1.3688 - regression_loss: 1.1940 - classification_loss: 0.1748 288/500 [================>.............] - ETA: 53s - loss: 1.3674 - regression_loss: 1.1928 - classification_loss: 0.1746 289/500 [================>.............] - ETA: 53s - loss: 1.3680 - regression_loss: 1.1935 - classification_loss: 0.1746 290/500 [================>.............] - ETA: 52s - loss: 1.3665 - regression_loss: 1.1922 - classification_loss: 0.1744 291/500 [================>.............] - ETA: 52s - loss: 1.3671 - regression_loss: 1.1926 - classification_loss: 0.1745 292/500 [================>.............] 
- ETA: 52s - loss: 1.3671 - regression_loss: 1.1925 - classification_loss: 0.1746 293/500 [================>.............] - ETA: 51s - loss: 1.3671 - regression_loss: 1.1926 - classification_loss: 0.1745 294/500 [================>.............] - ETA: 51s - loss: 1.3675 - regression_loss: 1.1932 - classification_loss: 0.1743 295/500 [================>.............] - ETA: 51s - loss: 1.3678 - regression_loss: 1.1935 - classification_loss: 0.1743 296/500 [================>.............] - ETA: 51s - loss: 1.3690 - regression_loss: 1.1946 - classification_loss: 0.1744 297/500 [================>.............] - ETA: 50s - loss: 1.3702 - regression_loss: 1.1956 - classification_loss: 0.1745 298/500 [================>.............] - ETA: 50s - loss: 1.3707 - regression_loss: 1.1960 - classification_loss: 0.1746 299/500 [================>.............] - ETA: 50s - loss: 1.3715 - regression_loss: 1.1969 - classification_loss: 0.1747 300/500 [=================>............] - ETA: 50s - loss: 1.3707 - regression_loss: 1.1961 - classification_loss: 0.1746 301/500 [=================>............] - ETA: 49s - loss: 1.3704 - regression_loss: 1.1957 - classification_loss: 0.1746 302/500 [=================>............] - ETA: 49s - loss: 1.3710 - regression_loss: 1.1962 - classification_loss: 0.1748 303/500 [=================>............] - ETA: 49s - loss: 1.3698 - regression_loss: 1.1951 - classification_loss: 0.1747 304/500 [=================>............] - ETA: 49s - loss: 1.3667 - regression_loss: 1.1924 - classification_loss: 0.1743 305/500 [=================>............] - ETA: 48s - loss: 1.3667 - regression_loss: 1.1922 - classification_loss: 0.1744 306/500 [=================>............] - ETA: 48s - loss: 1.3677 - regression_loss: 1.1931 - classification_loss: 0.1747 307/500 [=================>............] - ETA: 48s - loss: 1.3700 - regression_loss: 1.1951 - classification_loss: 0.1749 308/500 [=================>............] 
- ETA: 48s - loss: 1.3699 - regression_loss: 1.1949 - classification_loss: 0.1750 309/500 [=================>............] - ETA: 47s - loss: 1.3705 - regression_loss: 1.1955 - classification_loss: 0.1750 310/500 [=================>............] - ETA: 47s - loss: 1.3713 - regression_loss: 1.1960 - classification_loss: 0.1753 311/500 [=================>............] - ETA: 47s - loss: 1.3697 - regression_loss: 1.1943 - classification_loss: 0.1755 312/500 [=================>............] - ETA: 47s - loss: 1.3696 - regression_loss: 1.1940 - classification_loss: 0.1756 313/500 [=================>............] - ETA: 46s - loss: 1.3678 - regression_loss: 1.1923 - classification_loss: 0.1755 314/500 [=================>............] - ETA: 46s - loss: 1.3679 - regression_loss: 1.1924 - classification_loss: 0.1754 315/500 [=================>............] - ETA: 46s - loss: 1.3679 - regression_loss: 1.1924 - classification_loss: 0.1755 316/500 [=================>............] - ETA: 46s - loss: 1.3686 - regression_loss: 1.1930 - classification_loss: 0.1756 317/500 [==================>...........] - ETA: 45s - loss: 1.3687 - regression_loss: 1.1930 - classification_loss: 0.1757 318/500 [==================>...........] - ETA: 45s - loss: 1.3697 - regression_loss: 1.1938 - classification_loss: 0.1759 319/500 [==================>...........] - ETA: 45s - loss: 1.3703 - regression_loss: 1.1943 - classification_loss: 0.1760 320/500 [==================>...........] - ETA: 45s - loss: 1.3714 - regression_loss: 1.1953 - classification_loss: 0.1761 321/500 [==================>...........] - ETA: 44s - loss: 1.3696 - regression_loss: 1.1938 - classification_loss: 0.1759 322/500 [==================>...........] - ETA: 44s - loss: 1.3714 - regression_loss: 1.1954 - classification_loss: 0.1760 323/500 [==================>...........] - ETA: 44s - loss: 1.3721 - regression_loss: 1.1960 - classification_loss: 0.1761 324/500 [==================>...........] 
- ETA: 44s - loss: 1.3727 - regression_loss: 1.1963 - classification_loss: 0.1763 325/500 [==================>...........] - ETA: 43s - loss: 1.3734 - regression_loss: 1.1971 - classification_loss: 0.1763 326/500 [==================>...........] - ETA: 43s - loss: 1.3748 - regression_loss: 1.1986 - classification_loss: 0.1762 327/500 [==================>...........] - ETA: 43s - loss: 1.3751 - regression_loss: 1.1989 - classification_loss: 0.1762 328/500 [==================>...........] - ETA: 43s - loss: 1.3732 - regression_loss: 1.1973 - classification_loss: 0.1759 329/500 [==================>...........] - ETA: 42s - loss: 1.3725 - regression_loss: 1.1967 - classification_loss: 0.1757 330/500 [==================>...........] - ETA: 42s - loss: 1.3705 - regression_loss: 1.1950 - classification_loss: 0.1754 331/500 [==================>...........] - ETA: 42s - loss: 1.3706 - regression_loss: 1.1949 - classification_loss: 0.1756 332/500 [==================>...........] - ETA: 42s - loss: 1.3699 - regression_loss: 1.1944 - classification_loss: 0.1755 333/500 [==================>...........] - ETA: 41s - loss: 1.3710 - regression_loss: 1.1953 - classification_loss: 0.1757 334/500 [===================>..........] - ETA: 41s - loss: 1.3711 - regression_loss: 1.1954 - classification_loss: 0.1757 335/500 [===================>..........] - ETA: 41s - loss: 1.3702 - regression_loss: 1.1948 - classification_loss: 0.1755 336/500 [===================>..........] - ETA: 41s - loss: 1.3691 - regression_loss: 1.1938 - classification_loss: 0.1754 337/500 [===================>..........] - ETA: 40s - loss: 1.3698 - regression_loss: 1.1944 - classification_loss: 0.1754 338/500 [===================>..........] - ETA: 40s - loss: 1.3696 - regression_loss: 1.1943 - classification_loss: 0.1752 339/500 [===================>..........] - ETA: 40s - loss: 1.3700 - regression_loss: 1.1947 - classification_loss: 0.1753 340/500 [===================>..........] 
- ETA: 40s - loss: 1.3713 - regression_loss: 1.1959 - classification_loss: 0.1753 341/500 [===================>..........] - ETA: 39s - loss: 1.3707 - regression_loss: 1.1950 - classification_loss: 0.1757 342/500 [===================>..........] - ETA: 39s - loss: 1.3721 - regression_loss: 1.1961 - classification_loss: 0.1760 343/500 [===================>..........] - ETA: 39s - loss: 1.3719 - regression_loss: 1.1959 - classification_loss: 0.1760 344/500 [===================>..........] - ETA: 39s - loss: 1.3703 - regression_loss: 1.1947 - classification_loss: 0.1756 345/500 [===================>..........] - ETA: 38s - loss: 1.3723 - regression_loss: 1.1962 - classification_loss: 0.1761 346/500 [===================>..........] - ETA: 38s - loss: 1.3726 - regression_loss: 1.1964 - classification_loss: 0.1762 347/500 [===================>..........] - ETA: 38s - loss: 1.3733 - regression_loss: 1.1969 - classification_loss: 0.1765 348/500 [===================>..........] - ETA: 38s - loss: 1.3735 - regression_loss: 1.1969 - classification_loss: 0.1765 349/500 [===================>..........] - ETA: 37s - loss: 1.3711 - regression_loss: 1.1950 - classification_loss: 0.1762 350/500 [====================>.........] - ETA: 37s - loss: 1.3683 - regression_loss: 1.1925 - classification_loss: 0.1758 351/500 [====================>.........] - ETA: 37s - loss: 1.3664 - regression_loss: 1.1908 - classification_loss: 0.1756 352/500 [====================>.........] - ETA: 37s - loss: 1.3644 - regression_loss: 1.1891 - classification_loss: 0.1753 353/500 [====================>.........] - ETA: 36s - loss: 1.3660 - regression_loss: 1.1906 - classification_loss: 0.1754 354/500 [====================>.........] - ETA: 36s - loss: 1.3659 - regression_loss: 1.1905 - classification_loss: 0.1754 355/500 [====================>.........] - ETA: 36s - loss: 1.3663 - regression_loss: 1.1910 - classification_loss: 0.1753 356/500 [====================>.........] 
- ETA: 36s - loss: 1.3675 - regression_loss: 1.1918 - classification_loss: 0.1757 357/500 [====================>.........] - ETA: 35s - loss: 1.3657 - regression_loss: 1.1903 - classification_loss: 0.1754 358/500 [====================>.........] - ETA: 35s - loss: 1.3656 - regression_loss: 1.1900 - classification_loss: 0.1756 359/500 [====================>.........] - ETA: 35s - loss: 1.3660 - regression_loss: 1.1903 - classification_loss: 0.1757 360/500 [====================>.........] - ETA: 35s - loss: 1.3682 - regression_loss: 1.1921 - classification_loss: 0.1761 361/500 [====================>.........] - ETA: 34s - loss: 1.3690 - regression_loss: 1.1928 - classification_loss: 0.1762 362/500 [====================>.........] - ETA: 34s - loss: 1.3685 - regression_loss: 1.1925 - classification_loss: 0.1760 363/500 [====================>.........] - ETA: 34s - loss: 1.3699 - regression_loss: 1.1933 - classification_loss: 0.1767 364/500 [====================>.........] - ETA: 34s - loss: 1.3674 - regression_loss: 1.1910 - classification_loss: 0.1763 365/500 [====================>.........] - ETA: 33s - loss: 1.3659 - regression_loss: 1.1898 - classification_loss: 0.1761 366/500 [====================>.........] - ETA: 33s - loss: 1.3653 - regression_loss: 1.1892 - classification_loss: 0.1761 367/500 [=====================>........] - ETA: 33s - loss: 1.3661 - regression_loss: 1.1899 - classification_loss: 0.1762 368/500 [=====================>........] - ETA: 33s - loss: 1.3666 - regression_loss: 1.1905 - classification_loss: 0.1761 369/500 [=====================>........] - ETA: 32s - loss: 1.3654 - regression_loss: 1.1896 - classification_loss: 0.1759 370/500 [=====================>........] - ETA: 32s - loss: 1.3665 - regression_loss: 1.1906 - classification_loss: 0.1760 371/500 [=====================>........] - ETA: 32s - loss: 1.3674 - regression_loss: 1.1912 - classification_loss: 0.1762 372/500 [=====================>........] 
- ETA: 32s - loss: 1.3673 - regression_loss: 1.1912 - classification_loss: 0.1761 373/500 [=====================>........] - ETA: 31s - loss: 1.3663 - regression_loss: 1.1903 - classification_loss: 0.1760 374/500 [=====================>........] - ETA: 31s - loss: 1.3645 - regression_loss: 1.1888 - classification_loss: 0.1758 375/500 [=====================>........] - ETA: 31s - loss: 1.3665 - regression_loss: 1.1905 - classification_loss: 0.1760 376/500 [=====================>........] - ETA: 31s - loss: 1.3675 - regression_loss: 1.1915 - classification_loss: 0.1761 377/500 [=====================>........] - ETA: 30s - loss: 1.3685 - regression_loss: 1.1923 - classification_loss: 0.1762 378/500 [=====================>........] - ETA: 30s - loss: 1.3690 - regression_loss: 1.1927 - classification_loss: 0.1763 379/500 [=====================>........] - ETA: 30s - loss: 1.3692 - regression_loss: 1.1929 - classification_loss: 0.1763 380/500 [=====================>........] - ETA: 30s - loss: 1.3683 - regression_loss: 1.1923 - classification_loss: 0.1760 381/500 [=====================>........] - ETA: 29s - loss: 1.3698 - regression_loss: 1.1936 - classification_loss: 0.1762 382/500 [=====================>........] - ETA: 29s - loss: 1.3686 - regression_loss: 1.1925 - classification_loss: 0.1760 383/500 [=====================>........] - ETA: 29s - loss: 1.3664 - regression_loss: 1.1907 - classification_loss: 0.1757 384/500 [======================>.......] - ETA: 29s - loss: 1.3644 - regression_loss: 1.1890 - classification_loss: 0.1754 385/500 [======================>.......] - ETA: 28s - loss: 1.3628 - regression_loss: 1.1877 - classification_loss: 0.1752 386/500 [======================>.......] - ETA: 28s - loss: 1.3628 - regression_loss: 1.1876 - classification_loss: 0.1752 387/500 [======================>.......] - ETA: 28s - loss: 1.3631 - regression_loss: 1.1879 - classification_loss: 0.1753 388/500 [======================>.......] 
- ETA: 28s - loss: 1.3639 - regression_loss: 1.1884 - classification_loss: 0.1755 389/500 [======================>.......] - ETA: 27s - loss: 1.3644 - regression_loss: 1.1889 - classification_loss: 0.1755 390/500 [======================>.......] - ETA: 27s - loss: 1.3647 - regression_loss: 1.1892 - classification_loss: 0.1755 391/500 [======================>.......] - ETA: 27s - loss: 1.3649 - regression_loss: 1.1893 - classification_loss: 0.1755 392/500 [======================>.......] - ETA: 27s - loss: 1.3636 - regression_loss: 1.1884 - classification_loss: 0.1752 393/500 [======================>.......] - ETA: 26s - loss: 1.3645 - regression_loss: 1.1891 - classification_loss: 0.1753 394/500 [======================>.......] - ETA: 26s - loss: 1.3633 - regression_loss: 1.1881 - classification_loss: 0.1752 395/500 [======================>.......] - ETA: 26s - loss: 1.3615 - regression_loss: 1.1865 - classification_loss: 0.1749 396/500 [======================>.......] - ETA: 26s - loss: 1.3601 - regression_loss: 1.1853 - classification_loss: 0.1748 397/500 [======================>.......] - ETA: 25s - loss: 1.3600 - regression_loss: 1.1851 - classification_loss: 0.1748 398/500 [======================>.......] - ETA: 25s - loss: 1.3598 - regression_loss: 1.1850 - classification_loss: 0.1748 399/500 [======================>.......] - ETA: 25s - loss: 1.3598 - regression_loss: 1.1850 - classification_loss: 0.1748 400/500 [=======================>......] - ETA: 25s - loss: 1.3599 - regression_loss: 1.1852 - classification_loss: 0.1747 401/500 [=======================>......] - ETA: 24s - loss: 1.3602 - regression_loss: 1.1854 - classification_loss: 0.1748 402/500 [=======================>......] - ETA: 24s - loss: 1.3619 - regression_loss: 1.1868 - classification_loss: 0.1751 403/500 [=======================>......] - ETA: 24s - loss: 1.3606 - regression_loss: 1.1857 - classification_loss: 0.1749 404/500 [=======================>......] 
- ETA: 24s - loss: 1.3610 - regression_loss: 1.1863 - classification_loss: 0.1747 405/500 [=======================>......] - ETA: 23s - loss: 1.3614 - regression_loss: 1.1865 - classification_loss: 0.1748 406/500 [=======================>......] - ETA: 23s - loss: 1.3617 - regression_loss: 1.1869 - classification_loss: 0.1749 407/500 [=======================>......] - ETA: 23s - loss: 1.3617 - regression_loss: 1.1870 - classification_loss: 0.1747 408/500 [=======================>......] - ETA: 23s - loss: 1.3627 - regression_loss: 1.1877 - classification_loss: 0.1750 409/500 [=======================>......] - ETA: 22s - loss: 1.3605 - regression_loss: 1.1848 - classification_loss: 0.1757 410/500 [=======================>......] - ETA: 22s - loss: 1.3592 - regression_loss: 1.1836 - classification_loss: 0.1756 411/500 [=======================>......] - ETA: 22s - loss: 1.3585 - regression_loss: 1.1830 - classification_loss: 0.1755 412/500 [=======================>......] - ETA: 22s - loss: 1.3589 - regression_loss: 1.1834 - classification_loss: 0.1755 413/500 [=======================>......] - ETA: 21s - loss: 1.3601 - regression_loss: 1.1842 - classification_loss: 0.1759 414/500 [=======================>......] - ETA: 21s - loss: 1.3622 - regression_loss: 1.1857 - classification_loss: 0.1764 415/500 [=======================>......] - ETA: 21s - loss: 1.3630 - regression_loss: 1.1864 - classification_loss: 0.1766 416/500 [=======================>......] - ETA: 21s - loss: 1.3622 - regression_loss: 1.1859 - classification_loss: 0.1764 417/500 [========================>.....] - ETA: 20s - loss: 1.3623 - regression_loss: 1.1860 - classification_loss: 0.1763 418/500 [========================>.....] - ETA: 20s - loss: 1.3630 - regression_loss: 1.1867 - classification_loss: 0.1763 419/500 [========================>.....] - ETA: 20s - loss: 1.3633 - regression_loss: 1.1871 - classification_loss: 0.1763 420/500 [========================>.....] 
- ETA: 20s - loss: 1.3629 - regression_loss: 1.1868 - classification_loss: 0.1762 421/500 [========================>.....] - ETA: 19s - loss: 1.3629 - regression_loss: 1.1868 - classification_loss: 0.1761 422/500 [========================>.....] - ETA: 19s - loss: 1.3634 - regression_loss: 1.1872 - classification_loss: 0.1762 423/500 [========================>.....] - ETA: 19s - loss: 1.3642 - regression_loss: 1.1878 - classification_loss: 0.1764 424/500 [========================>.....] - ETA: 19s - loss: 1.3639 - regression_loss: 1.1875 - classification_loss: 0.1764 425/500 [========================>.....] - ETA: 18s - loss: 1.3624 - regression_loss: 1.1862 - classification_loss: 0.1762 426/500 [========================>.....] - ETA: 18s - loss: 1.3633 - regression_loss: 1.1870 - classification_loss: 0.1762 427/500 [========================>.....] - ETA: 18s - loss: 1.3635 - regression_loss: 1.1872 - classification_loss: 0.1763 428/500 [========================>.....] - ETA: 18s - loss: 1.3644 - regression_loss: 1.1881 - classification_loss: 0.1763 429/500 [========================>.....] - ETA: 17s - loss: 1.3646 - regression_loss: 1.1883 - classification_loss: 0.1764 430/500 [========================>.....] - ETA: 17s - loss: 1.3641 - regression_loss: 1.1879 - classification_loss: 0.1763 431/500 [========================>.....] - ETA: 17s - loss: 1.3621 - regression_loss: 1.1861 - classification_loss: 0.1760 432/500 [========================>.....] - ETA: 17s - loss: 1.3620 - regression_loss: 1.1860 - classification_loss: 0.1760 433/500 [========================>.....] - ETA: 16s - loss: 1.3622 - regression_loss: 1.1861 - classification_loss: 0.1760 434/500 [=========================>....] - ETA: 16s - loss: 1.3638 - regression_loss: 1.1875 - classification_loss: 0.1763 435/500 [=========================>....] - ETA: 16s - loss: 1.3642 - regression_loss: 1.1879 - classification_loss: 0.1763 436/500 [=========================>....] 
[per-step progress updates for steps 437/500 through 499/500 elided; running loss ~1.36, regression_loss ~1.19, classification_loss ~0.18]
500/500 [==============================] - 126s 251ms/step - loss: 1.3610 - regression_loss: 1.1848 - classification_loss: 0.1763
1172 instances of class plum with average precision: 0.7179
mAP: 0.7179
Epoch 00019: saving model to ./training/snapshots/resnet50_pascal_19.h5
Epoch 20/150
[per-step progress updates for steps 1/500 through 269/500 elided; running loss fluctuating between ~1.21 and ~1.45, regression_loss ~1.05-1.27, classification_loss ~0.16-0.20]
270/500 [===============>..............]
- ETA: 57s - loss: 1.3328 - regression_loss: 1.1631 - classification_loss: 0.1697 271/500 [===============>..............] - ETA: 57s - loss: 1.3335 - regression_loss: 1.1636 - classification_loss: 0.1698 272/500 [===============>..............] - ETA: 57s - loss: 1.3336 - regression_loss: 1.1637 - classification_loss: 0.1699 273/500 [===============>..............] - ETA: 56s - loss: 1.3363 - regression_loss: 1.1657 - classification_loss: 0.1706 274/500 [===============>..............] - ETA: 56s - loss: 1.3350 - regression_loss: 1.1647 - classification_loss: 0.1703 275/500 [===============>..............] - ETA: 56s - loss: 1.3338 - regression_loss: 1.1638 - classification_loss: 0.1700 276/500 [===============>..............] - ETA: 56s - loss: 1.3332 - regression_loss: 1.1633 - classification_loss: 0.1699 277/500 [===============>..............] - ETA: 55s - loss: 1.3323 - regression_loss: 1.1625 - classification_loss: 0.1698 278/500 [===============>..............] - ETA: 55s - loss: 1.3332 - regression_loss: 1.1633 - classification_loss: 0.1699 279/500 [===============>..............] - ETA: 55s - loss: 1.3321 - regression_loss: 1.1622 - classification_loss: 0.1699 280/500 [===============>..............] - ETA: 55s - loss: 1.3329 - regression_loss: 1.1629 - classification_loss: 0.1700 281/500 [===============>..............] - ETA: 54s - loss: 1.3340 - regression_loss: 1.1638 - classification_loss: 0.1702 282/500 [===============>..............] - ETA: 54s - loss: 1.3351 - regression_loss: 1.1647 - classification_loss: 0.1704 283/500 [===============>..............] - ETA: 54s - loss: 1.3339 - regression_loss: 1.1637 - classification_loss: 0.1702 284/500 [================>.............] - ETA: 54s - loss: 1.3355 - regression_loss: 1.1651 - classification_loss: 0.1704 285/500 [================>.............] - ETA: 53s - loss: 1.3339 - regression_loss: 1.1637 - classification_loss: 0.1703 286/500 [================>.............] 
- ETA: 53s - loss: 1.3315 - regression_loss: 1.1617 - classification_loss: 0.1698 287/500 [================>.............] - ETA: 53s - loss: 1.3303 - regression_loss: 1.1608 - classification_loss: 0.1695 288/500 [================>.............] - ETA: 53s - loss: 1.3300 - regression_loss: 1.1607 - classification_loss: 0.1694 289/500 [================>.............] - ETA: 52s - loss: 1.3289 - regression_loss: 1.1597 - classification_loss: 0.1692 290/500 [================>.............] - ETA: 52s - loss: 1.3274 - regression_loss: 1.1584 - classification_loss: 0.1690 291/500 [================>.............] - ETA: 52s - loss: 1.3280 - regression_loss: 1.1590 - classification_loss: 0.1691 292/500 [================>.............] - ETA: 52s - loss: 1.3285 - regression_loss: 1.1593 - classification_loss: 0.1692 293/500 [================>.............] - ETA: 51s - loss: 1.3263 - regression_loss: 1.1574 - classification_loss: 0.1689 294/500 [================>.............] - ETA: 51s - loss: 1.3263 - regression_loss: 1.1575 - classification_loss: 0.1688 295/500 [================>.............] - ETA: 51s - loss: 1.3246 - regression_loss: 1.1560 - classification_loss: 0.1686 296/500 [================>.............] - ETA: 51s - loss: 1.3257 - regression_loss: 1.1567 - classification_loss: 0.1690 297/500 [================>.............] - ETA: 50s - loss: 1.3269 - regression_loss: 1.1578 - classification_loss: 0.1691 298/500 [================>.............] - ETA: 50s - loss: 1.3267 - regression_loss: 1.1576 - classification_loss: 0.1691 299/500 [================>.............] - ETA: 50s - loss: 1.3277 - regression_loss: 1.1584 - classification_loss: 0.1693 300/500 [=================>............] - ETA: 50s - loss: 1.3272 - regression_loss: 1.1579 - classification_loss: 0.1692 301/500 [=================>............] - ETA: 49s - loss: 1.3276 - regression_loss: 1.1584 - classification_loss: 0.1692 302/500 [=================>............] 
- ETA: 49s - loss: 1.3264 - regression_loss: 1.1575 - classification_loss: 0.1689 303/500 [=================>............] - ETA: 49s - loss: 1.3269 - regression_loss: 1.1579 - classification_loss: 0.1689 304/500 [=================>............] - ETA: 49s - loss: 1.3257 - regression_loss: 1.1571 - classification_loss: 0.1686 305/500 [=================>............] - ETA: 48s - loss: 1.3256 - regression_loss: 1.1570 - classification_loss: 0.1686 306/500 [=================>............] - ETA: 48s - loss: 1.3249 - regression_loss: 1.1563 - classification_loss: 0.1686 307/500 [=================>............] - ETA: 48s - loss: 1.3257 - regression_loss: 1.1569 - classification_loss: 0.1688 308/500 [=================>............] - ETA: 48s - loss: 1.3267 - regression_loss: 1.1577 - classification_loss: 0.1690 309/500 [=================>............] - ETA: 47s - loss: 1.3266 - regression_loss: 1.1577 - classification_loss: 0.1688 310/500 [=================>............] - ETA: 47s - loss: 1.3249 - regression_loss: 1.1562 - classification_loss: 0.1687 311/500 [=================>............] - ETA: 47s - loss: 1.3261 - regression_loss: 1.1572 - classification_loss: 0.1689 312/500 [=================>............] - ETA: 47s - loss: 1.3251 - regression_loss: 1.1564 - classification_loss: 0.1688 313/500 [=================>............] - ETA: 46s - loss: 1.3254 - regression_loss: 1.1567 - classification_loss: 0.1687 314/500 [=================>............] - ETA: 46s - loss: 1.3246 - regression_loss: 1.1561 - classification_loss: 0.1685 315/500 [=================>............] - ETA: 46s - loss: 1.3261 - regression_loss: 1.1575 - classification_loss: 0.1686 316/500 [=================>............] - ETA: 46s - loss: 1.3277 - regression_loss: 1.1588 - classification_loss: 0.1689 317/500 [==================>...........] - ETA: 45s - loss: 1.3283 - regression_loss: 1.1596 - classification_loss: 0.1688 318/500 [==================>...........] 
- ETA: 45s - loss: 1.3276 - regression_loss: 1.1589 - classification_loss: 0.1687 319/500 [==================>...........] - ETA: 45s - loss: 1.3276 - regression_loss: 1.1591 - classification_loss: 0.1685 320/500 [==================>...........] - ETA: 45s - loss: 1.3271 - regression_loss: 1.1587 - classification_loss: 0.1684 321/500 [==================>...........] - ETA: 44s - loss: 1.3277 - regression_loss: 1.1591 - classification_loss: 0.1686 322/500 [==================>...........] - ETA: 44s - loss: 1.3281 - regression_loss: 1.1595 - classification_loss: 0.1686 323/500 [==================>...........] - ETA: 44s - loss: 1.3274 - regression_loss: 1.1590 - classification_loss: 0.1684 324/500 [==================>...........] - ETA: 44s - loss: 1.3285 - regression_loss: 1.1598 - classification_loss: 0.1687 325/500 [==================>...........] - ETA: 43s - loss: 1.3281 - regression_loss: 1.1596 - classification_loss: 0.1685 326/500 [==================>...........] - ETA: 43s - loss: 1.3274 - regression_loss: 1.1589 - classification_loss: 0.1684 327/500 [==================>...........] - ETA: 43s - loss: 1.3267 - regression_loss: 1.1583 - classification_loss: 0.1684 328/500 [==================>...........] - ETA: 43s - loss: 1.3283 - regression_loss: 1.1596 - classification_loss: 0.1688 329/500 [==================>...........] - ETA: 42s - loss: 1.3284 - regression_loss: 1.1599 - classification_loss: 0.1685 330/500 [==================>...........] - ETA: 42s - loss: 1.3275 - regression_loss: 1.1591 - classification_loss: 0.1684 331/500 [==================>...........] - ETA: 42s - loss: 1.3269 - regression_loss: 1.1585 - classification_loss: 0.1684 332/500 [==================>...........] - ETA: 42s - loss: 1.3291 - regression_loss: 1.1602 - classification_loss: 0.1689 333/500 [==================>...........] - ETA: 41s - loss: 1.3295 - regression_loss: 1.1603 - classification_loss: 0.1692 334/500 [===================>..........] 
- ETA: 41s - loss: 1.3290 - regression_loss: 1.1599 - classification_loss: 0.1691 335/500 [===================>..........] - ETA: 41s - loss: 1.3293 - regression_loss: 1.1601 - classification_loss: 0.1692 336/500 [===================>..........] - ETA: 41s - loss: 1.3297 - regression_loss: 1.1605 - classification_loss: 0.1693 337/500 [===================>..........] - ETA: 40s - loss: 1.3272 - regression_loss: 1.1582 - classification_loss: 0.1690 338/500 [===================>..........] - ETA: 40s - loss: 1.3274 - regression_loss: 1.1583 - classification_loss: 0.1691 339/500 [===================>..........] - ETA: 40s - loss: 1.3283 - regression_loss: 1.1591 - classification_loss: 0.1692 340/500 [===================>..........] - ETA: 40s - loss: 1.3290 - regression_loss: 1.1598 - classification_loss: 0.1692 341/500 [===================>..........] - ETA: 39s - loss: 1.3280 - regression_loss: 1.1591 - classification_loss: 0.1689 342/500 [===================>..........] - ETA: 39s - loss: 1.3288 - regression_loss: 1.1596 - classification_loss: 0.1691 343/500 [===================>..........] - ETA: 39s - loss: 1.3273 - regression_loss: 1.1584 - classification_loss: 0.1689 344/500 [===================>..........] - ETA: 39s - loss: 1.3268 - regression_loss: 1.1580 - classification_loss: 0.1689 345/500 [===================>..........] - ETA: 38s - loss: 1.3282 - regression_loss: 1.1594 - classification_loss: 0.1688 346/500 [===================>..........] - ETA: 38s - loss: 1.3270 - regression_loss: 1.1583 - classification_loss: 0.1687 347/500 [===================>..........] - ETA: 38s - loss: 1.3268 - regression_loss: 1.1581 - classification_loss: 0.1687 348/500 [===================>..........] - ETA: 38s - loss: 1.3268 - regression_loss: 1.1579 - classification_loss: 0.1689 349/500 [===================>..........] - ETA: 37s - loss: 1.3252 - regression_loss: 1.1565 - classification_loss: 0.1686 350/500 [====================>.........] 
- ETA: 37s - loss: 1.3245 - regression_loss: 1.1560 - classification_loss: 0.1685 351/500 [====================>.........] - ETA: 37s - loss: 1.3248 - regression_loss: 1.1563 - classification_loss: 0.1685 352/500 [====================>.........] - ETA: 37s - loss: 1.3227 - regression_loss: 1.1545 - classification_loss: 0.1682 353/500 [====================>.........] - ETA: 36s - loss: 1.3225 - regression_loss: 1.1543 - classification_loss: 0.1682 354/500 [====================>.........] - ETA: 36s - loss: 1.3239 - regression_loss: 1.1555 - classification_loss: 0.1684 355/500 [====================>.........] - ETA: 36s - loss: 1.3222 - regression_loss: 1.1540 - classification_loss: 0.1682 356/500 [====================>.........] - ETA: 36s - loss: 1.3227 - regression_loss: 1.1545 - classification_loss: 0.1683 357/500 [====================>.........] - ETA: 35s - loss: 1.3219 - regression_loss: 1.1537 - classification_loss: 0.1682 358/500 [====================>.........] - ETA: 35s - loss: 1.3211 - regression_loss: 1.1530 - classification_loss: 0.1681 359/500 [====================>.........] - ETA: 35s - loss: 1.3216 - regression_loss: 1.1536 - classification_loss: 0.1681 360/500 [====================>.........] - ETA: 35s - loss: 1.3220 - regression_loss: 1.1540 - classification_loss: 0.1680 361/500 [====================>.........] - ETA: 34s - loss: 1.3228 - regression_loss: 1.1546 - classification_loss: 0.1682 362/500 [====================>.........] - ETA: 34s - loss: 1.3240 - regression_loss: 1.1555 - classification_loss: 0.1685 363/500 [====================>.........] - ETA: 34s - loss: 1.3244 - regression_loss: 1.1560 - classification_loss: 0.1685 364/500 [====================>.........] - ETA: 34s - loss: 1.3242 - regression_loss: 1.1557 - classification_loss: 0.1685 365/500 [====================>.........] - ETA: 33s - loss: 1.3246 - regression_loss: 1.1560 - classification_loss: 0.1686 366/500 [====================>.........] 
- ETA: 33s - loss: 1.3251 - regression_loss: 1.1563 - classification_loss: 0.1687 367/500 [=====================>........] - ETA: 33s - loss: 1.3234 - regression_loss: 1.1550 - classification_loss: 0.1684 368/500 [=====================>........] - ETA: 33s - loss: 1.3213 - regression_loss: 1.1532 - classification_loss: 0.1681 369/500 [=====================>........] - ETA: 32s - loss: 1.3216 - regression_loss: 1.1535 - classification_loss: 0.1681 370/500 [=====================>........] - ETA: 32s - loss: 1.3229 - regression_loss: 1.1545 - classification_loss: 0.1684 371/500 [=====================>........] - ETA: 32s - loss: 1.3236 - regression_loss: 1.1553 - classification_loss: 0.1684 372/500 [=====================>........] - ETA: 32s - loss: 1.3238 - regression_loss: 1.1551 - classification_loss: 0.1687 373/500 [=====================>........] - ETA: 31s - loss: 1.3245 - regression_loss: 1.1556 - classification_loss: 0.1689 374/500 [=====================>........] - ETA: 31s - loss: 1.3245 - regression_loss: 1.1556 - classification_loss: 0.1689 375/500 [=====================>........] - ETA: 31s - loss: 1.3245 - regression_loss: 1.1556 - classification_loss: 0.1689 376/500 [=====================>........] - ETA: 31s - loss: 1.3226 - regression_loss: 1.1539 - classification_loss: 0.1686 377/500 [=====================>........] - ETA: 30s - loss: 1.3218 - regression_loss: 1.1533 - classification_loss: 0.1685 378/500 [=====================>........] - ETA: 30s - loss: 1.3225 - regression_loss: 1.1539 - classification_loss: 0.1686 379/500 [=====================>........] - ETA: 30s - loss: 1.3225 - regression_loss: 1.1539 - classification_loss: 0.1686 380/500 [=====================>........] - ETA: 30s - loss: 1.3216 - regression_loss: 1.1532 - classification_loss: 0.1685 381/500 [=====================>........] - ETA: 29s - loss: 1.3223 - regression_loss: 1.1538 - classification_loss: 0.1685 382/500 [=====================>........] 
- ETA: 29s - loss: 1.3228 - regression_loss: 1.1541 - classification_loss: 0.1687 383/500 [=====================>........] - ETA: 29s - loss: 1.3228 - regression_loss: 1.1540 - classification_loss: 0.1687 384/500 [======================>.......] - ETA: 29s - loss: 1.3227 - regression_loss: 1.1542 - classification_loss: 0.1685 385/500 [======================>.......] - ETA: 28s - loss: 1.3223 - regression_loss: 1.1538 - classification_loss: 0.1685 386/500 [======================>.......] - ETA: 28s - loss: 1.3211 - regression_loss: 1.1528 - classification_loss: 0.1683 387/500 [======================>.......] - ETA: 28s - loss: 1.3208 - regression_loss: 1.1525 - classification_loss: 0.1682 388/500 [======================>.......] - ETA: 28s - loss: 1.3211 - regression_loss: 1.1528 - classification_loss: 0.1683 389/500 [======================>.......] - ETA: 27s - loss: 1.3200 - regression_loss: 1.1519 - classification_loss: 0.1681 390/500 [======================>.......] - ETA: 27s - loss: 1.3199 - regression_loss: 1.1518 - classification_loss: 0.1681 391/500 [======================>.......] - ETA: 27s - loss: 1.3196 - regression_loss: 1.1516 - classification_loss: 0.1681 392/500 [======================>.......] - ETA: 27s - loss: 1.3196 - regression_loss: 1.1515 - classification_loss: 0.1680 393/500 [======================>.......] - ETA: 26s - loss: 1.3180 - regression_loss: 1.1502 - classification_loss: 0.1678 394/500 [======================>.......] - ETA: 26s - loss: 1.3182 - regression_loss: 1.1503 - classification_loss: 0.1679 395/500 [======================>.......] - ETA: 26s - loss: 1.3172 - regression_loss: 1.1494 - classification_loss: 0.1678 396/500 [======================>.......] - ETA: 26s - loss: 1.3183 - regression_loss: 1.1503 - classification_loss: 0.1680 397/500 [======================>.......] - ETA: 25s - loss: 1.3187 - regression_loss: 1.1507 - classification_loss: 0.1680 398/500 [======================>.......] 
- ETA: 25s - loss: 1.3196 - regression_loss: 1.1515 - classification_loss: 0.1681 399/500 [======================>.......] - ETA: 25s - loss: 1.3206 - regression_loss: 1.1522 - classification_loss: 0.1684 400/500 [=======================>......] - ETA: 25s - loss: 1.3208 - regression_loss: 1.1525 - classification_loss: 0.1683 401/500 [=======================>......] - ETA: 24s - loss: 1.3208 - regression_loss: 1.1525 - classification_loss: 0.1683 402/500 [=======================>......] - ETA: 24s - loss: 1.3198 - regression_loss: 1.1518 - classification_loss: 0.1680 403/500 [=======================>......] - ETA: 24s - loss: 1.3203 - regression_loss: 1.1523 - classification_loss: 0.1681 404/500 [=======================>......] - ETA: 24s - loss: 1.3222 - regression_loss: 1.1537 - classification_loss: 0.1685 405/500 [=======================>......] - ETA: 23s - loss: 1.3219 - regression_loss: 1.1535 - classification_loss: 0.1684 406/500 [=======================>......] - ETA: 23s - loss: 1.3217 - regression_loss: 1.1534 - classification_loss: 0.1684 407/500 [=======================>......] - ETA: 23s - loss: 1.3226 - regression_loss: 1.1542 - classification_loss: 0.1684 408/500 [=======================>......] - ETA: 23s - loss: 1.3208 - regression_loss: 1.1527 - classification_loss: 0.1681 409/500 [=======================>......] - ETA: 22s - loss: 1.3193 - regression_loss: 1.1513 - classification_loss: 0.1680 410/500 [=======================>......] - ETA: 22s - loss: 1.3172 - regression_loss: 1.1495 - classification_loss: 0.1677 411/500 [=======================>......] - ETA: 22s - loss: 1.3178 - regression_loss: 1.1498 - classification_loss: 0.1679 412/500 [=======================>......] - ETA: 22s - loss: 1.3175 - regression_loss: 1.1496 - classification_loss: 0.1679 413/500 [=======================>......] - ETA: 21s - loss: 1.3176 - regression_loss: 1.1497 - classification_loss: 0.1679 414/500 [=======================>......] 
- ETA: 21s - loss: 1.3179 - regression_loss: 1.1499 - classification_loss: 0.1680 415/500 [=======================>......] - ETA: 21s - loss: 1.3172 - regression_loss: 1.1494 - classification_loss: 0.1678 416/500 [=======================>......] - ETA: 21s - loss: 1.3171 - regression_loss: 1.1492 - classification_loss: 0.1679 417/500 [========================>.....] - ETA: 20s - loss: 1.3174 - regression_loss: 1.1495 - classification_loss: 0.1679 418/500 [========================>.....] - ETA: 20s - loss: 1.3193 - regression_loss: 1.1512 - classification_loss: 0.1681 419/500 [========================>.....] - ETA: 20s - loss: 1.3204 - regression_loss: 1.1521 - classification_loss: 0.1683 420/500 [========================>.....] - ETA: 20s - loss: 1.3199 - regression_loss: 1.1517 - classification_loss: 0.1682 421/500 [========================>.....] - ETA: 19s - loss: 1.3205 - regression_loss: 1.1522 - classification_loss: 0.1683 422/500 [========================>.....] - ETA: 19s - loss: 1.3202 - regression_loss: 1.1518 - classification_loss: 0.1684 423/500 [========================>.....] - ETA: 19s - loss: 1.3203 - regression_loss: 1.1517 - classification_loss: 0.1686 424/500 [========================>.....] - ETA: 19s - loss: 1.3204 - regression_loss: 1.1519 - classification_loss: 0.1685 425/500 [========================>.....] - ETA: 18s - loss: 1.3209 - regression_loss: 1.1524 - classification_loss: 0.1685 426/500 [========================>.....] - ETA: 18s - loss: 1.3205 - regression_loss: 1.1521 - classification_loss: 0.1684 427/500 [========================>.....] - ETA: 18s - loss: 1.3205 - regression_loss: 1.1521 - classification_loss: 0.1684 428/500 [========================>.....] - ETA: 18s - loss: 1.3212 - regression_loss: 1.1526 - classification_loss: 0.1685 429/500 [========================>.....] - ETA: 17s - loss: 1.3212 - regression_loss: 1.1527 - classification_loss: 0.1685 430/500 [========================>.....] 
- ETA: 17s - loss: 1.3210 - regression_loss: 1.1525 - classification_loss: 0.1685 431/500 [========================>.....] - ETA: 17s - loss: 1.3217 - regression_loss: 1.1530 - classification_loss: 0.1687 432/500 [========================>.....] - ETA: 17s - loss: 1.3204 - regression_loss: 1.1520 - classification_loss: 0.1684 433/500 [========================>.....] - ETA: 16s - loss: 1.3206 - regression_loss: 1.1521 - classification_loss: 0.1685 434/500 [=========================>....] - ETA: 16s - loss: 1.3208 - regression_loss: 1.1523 - classification_loss: 0.1685 435/500 [=========================>....] - ETA: 16s - loss: 1.3206 - regression_loss: 1.1522 - classification_loss: 0.1684 436/500 [=========================>....] - ETA: 16s - loss: 1.3184 - regression_loss: 1.1503 - classification_loss: 0.1681 437/500 [=========================>....] - ETA: 15s - loss: 1.3198 - regression_loss: 1.1515 - classification_loss: 0.1683 438/500 [=========================>....] - ETA: 15s - loss: 1.3200 - regression_loss: 1.1518 - classification_loss: 0.1682 439/500 [=========================>....] - ETA: 15s - loss: 1.3205 - regression_loss: 1.1520 - classification_loss: 0.1684 440/500 [=========================>....] - ETA: 15s - loss: 1.3197 - regression_loss: 1.1515 - classification_loss: 0.1682 441/500 [=========================>....] - ETA: 14s - loss: 1.3183 - regression_loss: 1.1503 - classification_loss: 0.1680 442/500 [=========================>....] - ETA: 14s - loss: 1.3178 - regression_loss: 1.1500 - classification_loss: 0.1678 443/500 [=========================>....] - ETA: 14s - loss: 1.3185 - regression_loss: 1.1506 - classification_loss: 0.1680 444/500 [=========================>....] - ETA: 14s - loss: 1.3179 - regression_loss: 1.1501 - classification_loss: 0.1679 445/500 [=========================>....] - ETA: 13s - loss: 1.3162 - regression_loss: 1.1485 - classification_loss: 0.1677 446/500 [=========================>....] 
- ETA: 13s - loss: 1.3154 - regression_loss: 1.1478 - classification_loss: 0.1676 447/500 [=========================>....] - ETA: 13s - loss: 1.3140 - regression_loss: 1.1465 - classification_loss: 0.1675 448/500 [=========================>....] - ETA: 13s - loss: 1.3156 - regression_loss: 1.1478 - classification_loss: 0.1678 449/500 [=========================>....] - ETA: 12s - loss: 1.3161 - regression_loss: 1.1484 - classification_loss: 0.1677 450/500 [==========================>...] - ETA: 12s - loss: 1.3148 - regression_loss: 1.1473 - classification_loss: 0.1675 451/500 [==========================>...] - ETA: 12s - loss: 1.3161 - regression_loss: 1.1483 - classification_loss: 0.1678 452/500 [==========================>...] - ETA: 12s - loss: 1.3170 - regression_loss: 1.1490 - classification_loss: 0.1680 453/500 [==========================>...] - ETA: 11s - loss: 1.3178 - regression_loss: 1.1496 - classification_loss: 0.1682 454/500 [==========================>...] - ETA: 11s - loss: 1.3182 - regression_loss: 1.1499 - classification_loss: 0.1683 455/500 [==========================>...] - ETA: 11s - loss: 1.3209 - regression_loss: 1.1525 - classification_loss: 0.1684 456/500 [==========================>...] - ETA: 11s - loss: 1.3207 - regression_loss: 1.1524 - classification_loss: 0.1683 457/500 [==========================>...] - ETA: 10s - loss: 1.3202 - regression_loss: 1.1520 - classification_loss: 0.1682 458/500 [==========================>...] - ETA: 10s - loss: 1.3187 - regression_loss: 1.1508 - classification_loss: 0.1679 459/500 [==========================>...] - ETA: 10s - loss: 1.3186 - regression_loss: 1.1507 - classification_loss: 0.1679 460/500 [==========================>...] - ETA: 10s - loss: 1.3178 - regression_loss: 1.1500 - classification_loss: 0.1678 461/500 [==========================>...] - ETA: 9s - loss: 1.3187 - regression_loss: 1.1507 - classification_loss: 0.1680  462/500 [==========================>...] 
- ETA: 9s - loss: 1.3189 - regression_loss: 1.1509 - classification_loss: 0.1680 463/500 [==========================>...] - ETA: 9s - loss: 1.3190 - regression_loss: 1.1510 - classification_loss: 0.1680 464/500 [==========================>...] - ETA: 9s - loss: 1.3188 - regression_loss: 1.1509 - classification_loss: 0.1679 465/500 [==========================>...] - ETA: 8s - loss: 1.3190 - regression_loss: 1.1511 - classification_loss: 0.1680 466/500 [==========================>...] - ETA: 8s - loss: 1.3192 - regression_loss: 1.1512 - classification_loss: 0.1680 467/500 [===========================>..] - ETA: 8s - loss: 1.3172 - regression_loss: 1.1493 - classification_loss: 0.1679 468/500 [===========================>..] - ETA: 8s - loss: 1.3173 - regression_loss: 1.1495 - classification_loss: 0.1678 469/500 [===========================>..] - ETA: 7s - loss: 1.3179 - regression_loss: 1.1500 - classification_loss: 0.1679 470/500 [===========================>..] - ETA: 7s - loss: 1.3191 - regression_loss: 1.1513 - classification_loss: 0.1678 471/500 [===========================>..] - ETA: 7s - loss: 1.3185 - regression_loss: 1.1506 - classification_loss: 0.1679 472/500 [===========================>..] - ETA: 7s - loss: 1.3189 - regression_loss: 1.1510 - classification_loss: 0.1679 473/500 [===========================>..] - ETA: 6s - loss: 1.3194 - regression_loss: 1.1515 - classification_loss: 0.1678 474/500 [===========================>..] - ETA: 6s - loss: 1.3202 - regression_loss: 1.1522 - classification_loss: 0.1680 475/500 [===========================>..] - ETA: 6s - loss: 1.3194 - regression_loss: 1.1516 - classification_loss: 0.1679 476/500 [===========================>..] - ETA: 6s - loss: 1.3200 - regression_loss: 1.1521 - classification_loss: 0.1679 477/500 [===========================>..] - ETA: 5s - loss: 1.3186 - regression_loss: 1.1509 - classification_loss: 0.1676 478/500 [===========================>..] 
[per-batch progress updates for the remainder of epoch 20 omitted; the running loss converged to the final values below]
500/500 [==============================] - 126s 252ms/step - loss: 1.3121 - regression_loss: 1.1456 - classification_loss: 0.1665
1172 instances of class plum with average precision: 0.7307
mAP: 0.7307
Epoch 00020: saving model to ./training/snapshots/resnet50_pascal_20.h5
Epoch 21/150
[per-batch progress updates omitted; through batch 312/500 the running loss was 1.3155 - regression_loss: 1.1477 - classification_loss: 0.1678]
313/500 [=================>............]
- ETA: 48s - loss: 1.3174 - regression_loss: 1.1492 - classification_loss: 0.1681 314/500 [=================>............] - ETA: 48s - loss: 1.3186 - regression_loss: 1.1503 - classification_loss: 0.1683 315/500 [=================>............] - ETA: 47s - loss: 1.3175 - regression_loss: 1.1494 - classification_loss: 0.1681 316/500 [=================>............] - ETA: 47s - loss: 1.3201 - regression_loss: 1.1515 - classification_loss: 0.1686 317/500 [==================>...........] - ETA: 47s - loss: 1.3225 - regression_loss: 1.1536 - classification_loss: 0.1689 318/500 [==================>...........] - ETA: 46s - loss: 1.3224 - regression_loss: 1.1536 - classification_loss: 0.1687 319/500 [==================>...........] - ETA: 46s - loss: 1.3216 - regression_loss: 1.1529 - classification_loss: 0.1687 320/500 [==================>...........] - ETA: 46s - loss: 1.3203 - regression_loss: 1.1518 - classification_loss: 0.1684 321/500 [==================>...........] - ETA: 46s - loss: 1.3188 - regression_loss: 1.1504 - classification_loss: 0.1683 322/500 [==================>...........] - ETA: 45s - loss: 1.3168 - regression_loss: 1.1489 - classification_loss: 0.1679 323/500 [==================>...........] - ETA: 45s - loss: 1.3174 - regression_loss: 1.1493 - classification_loss: 0.1681 324/500 [==================>...........] - ETA: 45s - loss: 1.3152 - regression_loss: 1.1474 - classification_loss: 0.1678 325/500 [==================>...........] - ETA: 45s - loss: 1.3159 - regression_loss: 1.1483 - classification_loss: 0.1676 326/500 [==================>...........] - ETA: 44s - loss: 1.3157 - regression_loss: 1.1482 - classification_loss: 0.1675 327/500 [==================>...........] - ETA: 44s - loss: 1.3168 - regression_loss: 1.1491 - classification_loss: 0.1677 328/500 [==================>...........] - ETA: 44s - loss: 1.3165 - regression_loss: 1.1489 - classification_loss: 0.1677 329/500 [==================>...........] 
- ETA: 44s - loss: 1.3162 - regression_loss: 1.1485 - classification_loss: 0.1676 330/500 [==================>...........] - ETA: 43s - loss: 1.3170 - regression_loss: 1.1491 - classification_loss: 0.1678 331/500 [==================>...........] - ETA: 43s - loss: 1.3164 - regression_loss: 1.1486 - classification_loss: 0.1678 332/500 [==================>...........] - ETA: 43s - loss: 1.3162 - regression_loss: 1.1485 - classification_loss: 0.1677 333/500 [==================>...........] - ETA: 43s - loss: 1.3178 - regression_loss: 1.1499 - classification_loss: 0.1680 334/500 [===================>..........] - ETA: 42s - loss: 1.3160 - regression_loss: 1.1483 - classification_loss: 0.1677 335/500 [===================>..........] - ETA: 42s - loss: 1.3133 - regression_loss: 1.1458 - classification_loss: 0.1675 336/500 [===================>..........] - ETA: 42s - loss: 1.3135 - regression_loss: 1.1460 - classification_loss: 0.1675 337/500 [===================>..........] - ETA: 42s - loss: 1.3146 - regression_loss: 1.1470 - classification_loss: 0.1676 338/500 [===================>..........] - ETA: 41s - loss: 1.3142 - regression_loss: 1.1466 - classification_loss: 0.1676 339/500 [===================>..........] - ETA: 41s - loss: 1.3153 - regression_loss: 1.1476 - classification_loss: 0.1677 340/500 [===================>..........] - ETA: 41s - loss: 1.3155 - regression_loss: 1.1477 - classification_loss: 0.1678 341/500 [===================>..........] - ETA: 41s - loss: 1.3159 - regression_loss: 1.1480 - classification_loss: 0.1679 342/500 [===================>..........] - ETA: 40s - loss: 1.3173 - regression_loss: 1.1492 - classification_loss: 0.1681 343/500 [===================>..........] - ETA: 40s - loss: 1.3176 - regression_loss: 1.1496 - classification_loss: 0.1680 344/500 [===================>..........] - ETA: 40s - loss: 1.3186 - regression_loss: 1.1505 - classification_loss: 0.1682 345/500 [===================>..........] 
- ETA: 40s - loss: 1.3180 - regression_loss: 1.1500 - classification_loss: 0.1680 346/500 [===================>..........] - ETA: 39s - loss: 1.3170 - regression_loss: 1.1491 - classification_loss: 0.1679 347/500 [===================>..........] - ETA: 39s - loss: 1.3147 - regression_loss: 1.1470 - classification_loss: 0.1677 348/500 [===================>..........] - ETA: 39s - loss: 1.3159 - regression_loss: 1.1480 - classification_loss: 0.1679 349/500 [===================>..........] - ETA: 39s - loss: 1.3170 - regression_loss: 1.1491 - classification_loss: 0.1679 350/500 [====================>.........] - ETA: 38s - loss: 1.3184 - regression_loss: 1.1504 - classification_loss: 0.1680 351/500 [====================>.........] - ETA: 38s - loss: 1.3198 - regression_loss: 1.1518 - classification_loss: 0.1680 352/500 [====================>.........] - ETA: 38s - loss: 1.3197 - regression_loss: 1.1517 - classification_loss: 0.1680 353/500 [====================>.........] - ETA: 37s - loss: 1.3184 - regression_loss: 1.1506 - classification_loss: 0.1678 354/500 [====================>.........] - ETA: 37s - loss: 1.3163 - regression_loss: 1.1488 - classification_loss: 0.1676 355/500 [====================>.........] - ETA: 37s - loss: 1.3140 - regression_loss: 1.1467 - classification_loss: 0.1672 356/500 [====================>.........] - ETA: 37s - loss: 1.3148 - regression_loss: 1.1474 - classification_loss: 0.1674 357/500 [====================>.........] - ETA: 36s - loss: 1.3159 - regression_loss: 1.1484 - classification_loss: 0.1676 358/500 [====================>.........] - ETA: 36s - loss: 1.3159 - regression_loss: 1.1484 - classification_loss: 0.1676 359/500 [====================>.........] - ETA: 36s - loss: 1.3151 - regression_loss: 1.1478 - classification_loss: 0.1673 360/500 [====================>.........] - ETA: 36s - loss: 1.3147 - regression_loss: 1.1476 - classification_loss: 0.1672 361/500 [====================>.........] 
- ETA: 35s - loss: 1.3140 - regression_loss: 1.1468 - classification_loss: 0.1671 362/500 [====================>.........] - ETA: 35s - loss: 1.3138 - regression_loss: 1.1467 - classification_loss: 0.1671 363/500 [====================>.........] - ETA: 35s - loss: 1.3130 - regression_loss: 1.1459 - classification_loss: 0.1671 364/500 [====================>.........] - ETA: 35s - loss: 1.3151 - regression_loss: 1.1478 - classification_loss: 0.1673 365/500 [====================>.........] - ETA: 34s - loss: 1.3155 - regression_loss: 1.1481 - classification_loss: 0.1673 366/500 [====================>.........] - ETA: 34s - loss: 1.3173 - regression_loss: 1.1498 - classification_loss: 0.1675 367/500 [=====================>........] - ETA: 34s - loss: 1.3167 - regression_loss: 1.1490 - classification_loss: 0.1676 368/500 [=====================>........] - ETA: 34s - loss: 1.3142 - regression_loss: 1.1470 - classification_loss: 0.1673 369/500 [=====================>........] - ETA: 33s - loss: 1.3147 - regression_loss: 1.1474 - classification_loss: 0.1673 370/500 [=====================>........] - ETA: 33s - loss: 1.3135 - regression_loss: 1.1463 - classification_loss: 0.1672 371/500 [=====================>........] - ETA: 33s - loss: 1.3141 - regression_loss: 1.1467 - classification_loss: 0.1673 372/500 [=====================>........] - ETA: 33s - loss: 1.3153 - regression_loss: 1.1478 - classification_loss: 0.1676 373/500 [=====================>........] - ETA: 32s - loss: 1.3158 - regression_loss: 1.1483 - classification_loss: 0.1676 374/500 [=====================>........] - ETA: 32s - loss: 1.3140 - regression_loss: 1.1467 - classification_loss: 0.1673 375/500 [=====================>........] - ETA: 32s - loss: 1.3157 - regression_loss: 1.1477 - classification_loss: 0.1680 376/500 [=====================>........] - ETA: 32s - loss: 1.3167 - regression_loss: 1.1486 - classification_loss: 0.1681 377/500 [=====================>........] 
- ETA: 31s - loss: 1.3174 - regression_loss: 1.1494 - classification_loss: 0.1680 378/500 [=====================>........] - ETA: 31s - loss: 1.3155 - regression_loss: 1.1477 - classification_loss: 0.1678 379/500 [=====================>........] - ETA: 31s - loss: 1.3146 - regression_loss: 1.1471 - classification_loss: 0.1676 380/500 [=====================>........] - ETA: 31s - loss: 1.3161 - regression_loss: 1.1483 - classification_loss: 0.1679 381/500 [=====================>........] - ETA: 30s - loss: 1.3166 - regression_loss: 1.1487 - classification_loss: 0.1680 382/500 [=====================>........] - ETA: 30s - loss: 1.3197 - regression_loss: 1.1515 - classification_loss: 0.1682 383/500 [=====================>........] - ETA: 30s - loss: 1.3202 - regression_loss: 1.1519 - classification_loss: 0.1682 384/500 [======================>.......] - ETA: 29s - loss: 1.3199 - regression_loss: 1.1517 - classification_loss: 0.1682 385/500 [======================>.......] - ETA: 29s - loss: 1.3203 - regression_loss: 1.1520 - classification_loss: 0.1682 386/500 [======================>.......] - ETA: 29s - loss: 1.3214 - regression_loss: 1.1531 - classification_loss: 0.1684 387/500 [======================>.......] - ETA: 29s - loss: 1.3216 - regression_loss: 1.1532 - classification_loss: 0.1684 388/500 [======================>.......] - ETA: 28s - loss: 1.3204 - regression_loss: 1.1522 - classification_loss: 0.1682 389/500 [======================>.......] - ETA: 28s - loss: 1.3203 - regression_loss: 1.1521 - classification_loss: 0.1682 390/500 [======================>.......] - ETA: 28s - loss: 1.3198 - regression_loss: 1.1517 - classification_loss: 0.1681 391/500 [======================>.......] - ETA: 28s - loss: 1.3200 - regression_loss: 1.1519 - classification_loss: 0.1681 392/500 [======================>.......] - ETA: 27s - loss: 1.3227 - regression_loss: 1.1541 - classification_loss: 0.1685 393/500 [======================>.......] 
- ETA: 27s - loss: 1.3227 - regression_loss: 1.1541 - classification_loss: 0.1686 394/500 [======================>.......] - ETA: 27s - loss: 1.3236 - regression_loss: 1.1548 - classification_loss: 0.1688 395/500 [======================>.......] - ETA: 27s - loss: 1.3227 - regression_loss: 1.1541 - classification_loss: 0.1686 396/500 [======================>.......] - ETA: 26s - loss: 1.3208 - regression_loss: 1.1525 - classification_loss: 0.1683 397/500 [======================>.......] - ETA: 26s - loss: 1.3208 - regression_loss: 1.1526 - classification_loss: 0.1683 398/500 [======================>.......] - ETA: 26s - loss: 1.3190 - regression_loss: 1.1510 - classification_loss: 0.1680 399/500 [======================>.......] - ETA: 26s - loss: 1.3177 - regression_loss: 1.1499 - classification_loss: 0.1678 400/500 [=======================>......] - ETA: 25s - loss: 1.3170 - regression_loss: 1.1495 - classification_loss: 0.1676 401/500 [=======================>......] - ETA: 25s - loss: 1.3173 - regression_loss: 1.1496 - classification_loss: 0.1676 402/500 [=======================>......] - ETA: 25s - loss: 1.3178 - regression_loss: 1.1502 - classification_loss: 0.1677 403/500 [=======================>......] - ETA: 25s - loss: 1.3189 - regression_loss: 1.1511 - classification_loss: 0.1677 404/500 [=======================>......] - ETA: 24s - loss: 1.3181 - regression_loss: 1.1505 - classification_loss: 0.1676 405/500 [=======================>......] - ETA: 24s - loss: 1.3175 - regression_loss: 1.1496 - classification_loss: 0.1679 406/500 [=======================>......] - ETA: 24s - loss: 1.3177 - regression_loss: 1.1498 - classification_loss: 0.1679 407/500 [=======================>......] - ETA: 24s - loss: 1.3158 - regression_loss: 1.1482 - classification_loss: 0.1676 408/500 [=======================>......] - ETA: 23s - loss: 1.3146 - regression_loss: 1.1471 - classification_loss: 0.1675 409/500 [=======================>......] 
- ETA: 23s - loss: 1.3128 - regression_loss: 1.1455 - classification_loss: 0.1672 410/500 [=======================>......] - ETA: 23s - loss: 1.3129 - regression_loss: 1.1457 - classification_loss: 0.1671 411/500 [=======================>......] - ETA: 22s - loss: 1.3126 - regression_loss: 1.1455 - classification_loss: 0.1671 412/500 [=======================>......] - ETA: 22s - loss: 1.3107 - regression_loss: 1.1438 - classification_loss: 0.1669 413/500 [=======================>......] - ETA: 22s - loss: 1.3105 - regression_loss: 1.1436 - classification_loss: 0.1669 414/500 [=======================>......] - ETA: 22s - loss: 1.3106 - regression_loss: 1.1437 - classification_loss: 0.1669 415/500 [=======================>......] - ETA: 21s - loss: 1.3104 - regression_loss: 1.1435 - classification_loss: 0.1669 416/500 [=======================>......] - ETA: 21s - loss: 1.3112 - regression_loss: 1.1442 - classification_loss: 0.1670 417/500 [========================>.....] - ETA: 21s - loss: 1.3111 - regression_loss: 1.1442 - classification_loss: 0.1669 418/500 [========================>.....] - ETA: 21s - loss: 1.3112 - regression_loss: 1.1443 - classification_loss: 0.1670 419/500 [========================>.....] - ETA: 20s - loss: 1.3096 - regression_loss: 1.1429 - classification_loss: 0.1667 420/500 [========================>.....] - ETA: 20s - loss: 1.3090 - regression_loss: 1.1424 - classification_loss: 0.1666 421/500 [========================>.....] - ETA: 20s - loss: 1.3093 - regression_loss: 1.1426 - classification_loss: 0.1667 422/500 [========================>.....] - ETA: 20s - loss: 1.3092 - regression_loss: 1.1427 - classification_loss: 0.1666 423/500 [========================>.....] - ETA: 19s - loss: 1.3094 - regression_loss: 1.1428 - classification_loss: 0.1666 424/500 [========================>.....] - ETA: 19s - loss: 1.3099 - regression_loss: 1.1432 - classification_loss: 0.1666 425/500 [========================>.....] 
- ETA: 19s - loss: 1.3117 - regression_loss: 1.1447 - classification_loss: 0.1670 426/500 [========================>.....] - ETA: 19s - loss: 1.3113 - regression_loss: 1.1444 - classification_loss: 0.1669 427/500 [========================>.....] - ETA: 18s - loss: 1.3105 - regression_loss: 1.1437 - classification_loss: 0.1668 428/500 [========================>.....] - ETA: 18s - loss: 1.3109 - regression_loss: 1.1440 - classification_loss: 0.1668 429/500 [========================>.....] - ETA: 18s - loss: 1.3154 - regression_loss: 1.1476 - classification_loss: 0.1678 430/500 [========================>.....] - ETA: 18s - loss: 1.3150 - regression_loss: 1.1474 - classification_loss: 0.1676 431/500 [========================>.....] - ETA: 17s - loss: 1.3156 - regression_loss: 1.1478 - classification_loss: 0.1677 432/500 [========================>.....] - ETA: 17s - loss: 1.3157 - regression_loss: 1.1482 - classification_loss: 0.1676 433/500 [========================>.....] - ETA: 17s - loss: 1.3160 - regression_loss: 1.1485 - classification_loss: 0.1676 434/500 [=========================>....] - ETA: 17s - loss: 1.3169 - regression_loss: 1.1494 - classification_loss: 0.1675 435/500 [=========================>....] - ETA: 16s - loss: 1.3158 - regression_loss: 1.1484 - classification_loss: 0.1674 436/500 [=========================>....] - ETA: 16s - loss: 1.3151 - regression_loss: 1.1477 - classification_loss: 0.1674 437/500 [=========================>....] - ETA: 16s - loss: 1.3149 - regression_loss: 1.1477 - classification_loss: 0.1672 438/500 [=========================>....] - ETA: 15s - loss: 1.3142 - regression_loss: 1.1472 - classification_loss: 0.1670 439/500 [=========================>....] - ETA: 15s - loss: 1.3149 - regression_loss: 1.1477 - classification_loss: 0.1671 440/500 [=========================>....] - ETA: 15s - loss: 1.3154 - regression_loss: 1.1482 - classification_loss: 0.1673 441/500 [=========================>....] 
- ETA: 15s - loss: 1.3162 - regression_loss: 1.1487 - classification_loss: 0.1674 442/500 [=========================>....] - ETA: 14s - loss: 1.3164 - regression_loss: 1.1490 - classification_loss: 0.1674 443/500 [=========================>....] - ETA: 14s - loss: 1.3178 - regression_loss: 1.1502 - classification_loss: 0.1676 444/500 [=========================>....] - ETA: 14s - loss: 1.3179 - regression_loss: 1.1504 - classification_loss: 0.1675 445/500 [=========================>....] - ETA: 14s - loss: 1.3179 - regression_loss: 1.1504 - classification_loss: 0.1675 446/500 [=========================>....] - ETA: 13s - loss: 1.3190 - regression_loss: 1.1513 - classification_loss: 0.1677 447/500 [=========================>....] - ETA: 13s - loss: 1.3194 - regression_loss: 1.1516 - classification_loss: 0.1678 448/500 [=========================>....] - ETA: 13s - loss: 1.3198 - regression_loss: 1.1520 - classification_loss: 0.1678 449/500 [=========================>....] - ETA: 13s - loss: 1.3177 - regression_loss: 1.1502 - classification_loss: 0.1675 450/500 [==========================>...] - ETA: 12s - loss: 1.3184 - regression_loss: 1.1509 - classification_loss: 0.1676 451/500 [==========================>...] - ETA: 12s - loss: 1.3186 - regression_loss: 1.1510 - classification_loss: 0.1676 452/500 [==========================>...] - ETA: 12s - loss: 1.3191 - regression_loss: 1.1515 - classification_loss: 0.1675 453/500 [==========================>...] - ETA: 12s - loss: 1.3183 - regression_loss: 1.1509 - classification_loss: 0.1675 454/500 [==========================>...] - ETA: 11s - loss: 1.3186 - regression_loss: 1.1512 - classification_loss: 0.1674 455/500 [==========================>...] - ETA: 11s - loss: 1.3190 - regression_loss: 1.1515 - classification_loss: 0.1674 456/500 [==========================>...] - ETA: 11s - loss: 1.3191 - regression_loss: 1.1516 - classification_loss: 0.1675 457/500 [==========================>...] 
- ETA: 11s - loss: 1.3178 - regression_loss: 1.1506 - classification_loss: 0.1673 458/500 [==========================>...] - ETA: 10s - loss: 1.3180 - regression_loss: 1.1509 - classification_loss: 0.1672 459/500 [==========================>...] - ETA: 10s - loss: 1.3164 - regression_loss: 1.1495 - classification_loss: 0.1669 460/500 [==========================>...] - ETA: 10s - loss: 1.3173 - regression_loss: 1.1501 - classification_loss: 0.1672 461/500 [==========================>...] - ETA: 10s - loss: 1.3182 - regression_loss: 1.1509 - classification_loss: 0.1673 462/500 [==========================>...] - ETA: 9s - loss: 1.3181 - regression_loss: 1.1509 - classification_loss: 0.1672  463/500 [==========================>...] - ETA: 9s - loss: 1.3184 - regression_loss: 1.1511 - classification_loss: 0.1673 464/500 [==========================>...] - ETA: 9s - loss: 1.3188 - regression_loss: 1.1515 - classification_loss: 0.1673 465/500 [==========================>...] - ETA: 9s - loss: 1.3186 - regression_loss: 1.1514 - classification_loss: 0.1673 466/500 [==========================>...] - ETA: 8s - loss: 1.3191 - regression_loss: 1.1517 - classification_loss: 0.1674 467/500 [===========================>..] - ETA: 8s - loss: 1.3182 - regression_loss: 1.1509 - classification_loss: 0.1673 468/500 [===========================>..] - ETA: 8s - loss: 1.3189 - regression_loss: 1.1513 - classification_loss: 0.1675 469/500 [===========================>..] - ETA: 8s - loss: 1.3182 - regression_loss: 1.1507 - classification_loss: 0.1674 470/500 [===========================>..] - ETA: 7s - loss: 1.3169 - regression_loss: 1.1497 - classification_loss: 0.1672 471/500 [===========================>..] - ETA: 7s - loss: 1.3176 - regression_loss: 1.1503 - classification_loss: 0.1673 472/500 [===========================>..] - ETA: 7s - loss: 1.3176 - regression_loss: 1.1504 - classification_loss: 0.1672 473/500 [===========================>..] 
- ETA: 6s - loss: 1.3185 - regression_loss: 1.1513 - classification_loss: 0.1672 474/500 [===========================>..] - ETA: 6s - loss: 1.3181 - regression_loss: 1.1509 - classification_loss: 0.1672 475/500 [===========================>..] - ETA: 6s - loss: 1.3181 - regression_loss: 1.1509 - classification_loss: 0.1672 476/500 [===========================>..] - ETA: 6s - loss: 1.3182 - regression_loss: 1.1510 - classification_loss: 0.1672 477/500 [===========================>..] - ETA: 5s - loss: 1.3170 - regression_loss: 1.1499 - classification_loss: 0.1671 478/500 [===========================>..] - ETA: 5s - loss: 1.3181 - regression_loss: 1.1508 - classification_loss: 0.1672 479/500 [===========================>..] - ETA: 5s - loss: 1.3187 - regression_loss: 1.1514 - classification_loss: 0.1672 480/500 [===========================>..] - ETA: 5s - loss: 1.3177 - regression_loss: 1.1508 - classification_loss: 0.1670 481/500 [===========================>..] - ETA: 4s - loss: 1.3188 - regression_loss: 1.1515 - classification_loss: 0.1673 482/500 [===========================>..] - ETA: 4s - loss: 1.3179 - regression_loss: 1.1508 - classification_loss: 0.1671 483/500 [===========================>..] - ETA: 4s - loss: 1.3186 - regression_loss: 1.1514 - classification_loss: 0.1672 484/500 [============================>.] - ETA: 4s - loss: 1.3187 - regression_loss: 1.1514 - classification_loss: 0.1672 485/500 [============================>.] - ETA: 3s - loss: 1.3169 - regression_loss: 1.1499 - classification_loss: 0.1670 486/500 [============================>.] - ETA: 3s - loss: 1.3166 - regression_loss: 1.1496 - classification_loss: 0.1669 487/500 [============================>.] - ETA: 3s - loss: 1.3174 - regression_loss: 1.1504 - classification_loss: 0.1671 488/500 [============================>.] - ETA: 3s - loss: 1.3163 - regression_loss: 1.1494 - classification_loss: 0.1669 489/500 [============================>.] 
- ETA: 2s - loss: 1.3171 - regression_loss: 1.1501 - classification_loss: 0.1670 490/500 [============================>.] - ETA: 2s - loss: 1.3171 - regression_loss: 1.1500 - classification_loss: 0.1670 491/500 [============================>.] - ETA: 2s - loss: 1.3178 - regression_loss: 1.1507 - classification_loss: 0.1671 492/500 [============================>.] - ETA: 2s - loss: 1.3176 - regression_loss: 1.1507 - classification_loss: 0.1670 493/500 [============================>.] - ETA: 1s - loss: 1.3175 - regression_loss: 1.1506 - classification_loss: 0.1669 494/500 [============================>.] - ETA: 1s - loss: 1.3171 - regression_loss: 1.1503 - classification_loss: 0.1668 495/500 [============================>.] - ETA: 1s - loss: 1.3168 - regression_loss: 1.1500 - classification_loss: 0.1668 496/500 [============================>.] - ETA: 1s - loss: 1.3155 - regression_loss: 1.1489 - classification_loss: 0.1666 497/500 [============================>.] - ETA: 0s - loss: 1.3169 - regression_loss: 1.1500 - classification_loss: 0.1669 498/500 [============================>.] - ETA: 0s - loss: 1.3166 - regression_loss: 1.1497 - classification_loss: 0.1669 499/500 [============================>.] - ETA: 0s - loss: 1.3174 - regression_loss: 1.1503 - classification_loss: 0.1671 500/500 [==============================] - 129s 258ms/step - loss: 1.3176 - regression_loss: 1.1506 - classification_loss: 0.1670 1172 instances of class plum with average precision: 0.7558 mAP: 0.7558 Epoch 00021: saving model to ./training/snapshots/resnet50_pascal_21.h5 Epoch 22/150 1/500 [..............................] - ETA: 2:01 - loss: 1.3457 - regression_loss: 1.1713 - classification_loss: 0.1744 2/500 [..............................] - ETA: 2:04 - loss: 1.2838 - regression_loss: 1.1260 - classification_loss: 0.1578 3/500 [..............................] - ETA: 2:07 - loss: 1.3789 - regression_loss: 1.2171 - classification_loss: 0.1617 4/500 [..............................] 
- ETA: 2:05 - loss: 1.5133 - regression_loss: 1.3429 - classification_loss: 0.1704 5/500 [..............................] - ETA: 2:04 - loss: 1.5525 - regression_loss: 1.3767 - classification_loss: 0.1758 6/500 [..............................] - ETA: 2:03 - loss: 1.5677 - regression_loss: 1.3886 - classification_loss: 0.1791 7/500 [..............................] - ETA: 2:03 - loss: 1.5602 - regression_loss: 1.3787 - classification_loss: 0.1815 8/500 [..............................] - ETA: 2:04 - loss: 1.5283 - regression_loss: 1.3562 - classification_loss: 0.1721 9/500 [..............................] - ETA: 2:04 - loss: 1.5400 - regression_loss: 1.3622 - classification_loss: 0.1778 10/500 [..............................] - ETA: 2:05 - loss: 1.4985 - regression_loss: 1.3278 - classification_loss: 0.1707 11/500 [..............................] - ETA: 2:05 - loss: 1.4575 - regression_loss: 1.2874 - classification_loss: 0.1701 12/500 [..............................] - ETA: 2:05 - loss: 1.4359 - regression_loss: 1.2705 - classification_loss: 0.1654 13/500 [..............................] - ETA: 2:04 - loss: 1.4400 - regression_loss: 1.2766 - classification_loss: 0.1633 14/500 [..............................] - ETA: 2:04 - loss: 1.4008 - regression_loss: 1.2365 - classification_loss: 0.1643 15/500 [..............................] - ETA: 2:04 - loss: 1.3672 - regression_loss: 1.2091 - classification_loss: 0.1581 16/500 [..............................] - ETA: 2:04 - loss: 1.3665 - regression_loss: 1.2076 - classification_loss: 0.1589 17/500 [>.............................] - ETA: 2:03 - loss: 1.3292 - regression_loss: 1.1762 - classification_loss: 0.1530 18/500 [>.............................] - ETA: 2:02 - loss: 1.3587 - regression_loss: 1.2003 - classification_loss: 0.1584 19/500 [>.............................] - ETA: 2:02 - loss: 1.3444 - regression_loss: 1.1880 - classification_loss: 0.1564 20/500 [>.............................] 
[per-batch progress-bar redraws condensed; representative snapshots for steps 21-355 of 500:]
 21/500 [>.............................] - ETA: 2:02 - loss: 1.3785 - regression_loss: 1.2135 - classification_loss: 0.1650
 50/500 [==>...........................] - ETA: 1:55 - loss: 1.3275 - regression_loss: 1.1642 - classification_loss: 0.1633
100/500 [=====>........................] - ETA: 1:42 - loss: 1.3199 - regression_loss: 1.1544 - classification_loss: 0.1655
150/500 [========>.....................] - ETA: 1:29 - loss: 1.2843 - regression_loss: 1.1245 - classification_loss: 0.1598
200/500 [===========>..................] - ETA: 1:17 - loss: 1.2856 - regression_loss: 1.1225 - classification_loss: 0.1632
250/500 [==============>...............] - ETA: 1:04 - loss: 1.2661 - regression_loss: 1.1050 - classification_loss: 0.1610
300/500 [=================>............] - ETA: 51s - loss: 1.2561 - regression_loss: 1.0965 - classification_loss: 0.1596
350/500 [====================>.........] - ETA: 38s - loss: 1.2613 - regression_loss: 1.1027 - classification_loss: 0.1586
355/500 [====================>.........] - ETA: 37s - loss: 1.2629 - regression_loss: 1.1042 - classification_loss: 0.1587
- ETA: 37s - loss: 1.2617 - regression_loss: 1.1032 - classification_loss: 0.1585 357/500 [====================>.........] - ETA: 36s - loss: 1.2613 - regression_loss: 1.1030 - classification_loss: 0.1583 358/500 [====================>.........] - ETA: 36s - loss: 1.2614 - regression_loss: 1.1031 - classification_loss: 0.1583 359/500 [====================>.........] - ETA: 36s - loss: 1.2615 - regression_loss: 1.1032 - classification_loss: 0.1583 360/500 [====================>.........] - ETA: 36s - loss: 1.2635 - regression_loss: 1.1050 - classification_loss: 0.1585 361/500 [====================>.........] - ETA: 35s - loss: 1.2648 - regression_loss: 1.1061 - classification_loss: 0.1587 362/500 [====================>.........] - ETA: 35s - loss: 1.2639 - regression_loss: 1.1054 - classification_loss: 0.1585 363/500 [====================>.........] - ETA: 35s - loss: 1.2626 - regression_loss: 1.1043 - classification_loss: 0.1582 364/500 [====================>.........] - ETA: 35s - loss: 1.2624 - regression_loss: 1.1042 - classification_loss: 0.1582 365/500 [====================>.........] - ETA: 34s - loss: 1.2623 - regression_loss: 1.1043 - classification_loss: 0.1580 366/500 [====================>.........] - ETA: 34s - loss: 1.2624 - regression_loss: 1.1044 - classification_loss: 0.1579 367/500 [=====================>........] - ETA: 34s - loss: 1.2629 - regression_loss: 1.1049 - classification_loss: 0.1580 368/500 [=====================>........] - ETA: 34s - loss: 1.2632 - regression_loss: 1.1051 - classification_loss: 0.1581 369/500 [=====================>........] - ETA: 33s - loss: 1.2641 - regression_loss: 1.1060 - classification_loss: 0.1581 370/500 [=====================>........] - ETA: 33s - loss: 1.2637 - regression_loss: 1.1055 - classification_loss: 0.1582 371/500 [=====================>........] - ETA: 33s - loss: 1.2625 - regression_loss: 1.1046 - classification_loss: 0.1579 372/500 [=====================>........] 
- ETA: 33s - loss: 1.2616 - regression_loss: 1.1039 - classification_loss: 0.1577 373/500 [=====================>........] - ETA: 32s - loss: 1.2602 - regression_loss: 1.1028 - classification_loss: 0.1574 374/500 [=====================>........] - ETA: 32s - loss: 1.2601 - regression_loss: 1.1027 - classification_loss: 0.1574 375/500 [=====================>........] - ETA: 32s - loss: 1.2595 - regression_loss: 1.1022 - classification_loss: 0.1573 376/500 [=====================>........] - ETA: 32s - loss: 1.2604 - regression_loss: 1.1030 - classification_loss: 0.1574 377/500 [=====================>........] - ETA: 31s - loss: 1.2607 - regression_loss: 1.1031 - classification_loss: 0.1576 378/500 [=====================>........] - ETA: 31s - loss: 1.2611 - regression_loss: 1.1035 - classification_loss: 0.1576 379/500 [=====================>........] - ETA: 31s - loss: 1.2614 - regression_loss: 1.1039 - classification_loss: 0.1575 380/500 [=====================>........] - ETA: 30s - loss: 1.2594 - regression_loss: 1.1022 - classification_loss: 0.1572 381/500 [=====================>........] - ETA: 30s - loss: 1.2586 - regression_loss: 1.1015 - classification_loss: 0.1571 382/500 [=====================>........] - ETA: 30s - loss: 1.2586 - regression_loss: 1.1015 - classification_loss: 0.1570 383/500 [=====================>........] - ETA: 30s - loss: 1.2586 - regression_loss: 1.1016 - classification_loss: 0.1571 384/500 [======================>.......] - ETA: 29s - loss: 1.2574 - regression_loss: 1.1006 - classification_loss: 0.1568 385/500 [======================>.......] - ETA: 29s - loss: 1.2581 - regression_loss: 1.1013 - classification_loss: 0.1568 386/500 [======================>.......] - ETA: 29s - loss: 1.2590 - regression_loss: 1.1020 - classification_loss: 0.1570 387/500 [======================>.......] - ETA: 29s - loss: 1.2600 - regression_loss: 1.1028 - classification_loss: 0.1571 388/500 [======================>.......] 
- ETA: 28s - loss: 1.2603 - regression_loss: 1.1032 - classification_loss: 0.1571 389/500 [======================>.......] - ETA: 28s - loss: 1.2594 - regression_loss: 1.1024 - classification_loss: 0.1570 390/500 [======================>.......] - ETA: 28s - loss: 1.2609 - regression_loss: 1.1036 - classification_loss: 0.1573 391/500 [======================>.......] - ETA: 28s - loss: 1.2587 - regression_loss: 1.1016 - classification_loss: 0.1571 392/500 [======================>.......] - ETA: 27s - loss: 1.2591 - regression_loss: 1.1019 - classification_loss: 0.1572 393/500 [======================>.......] - ETA: 27s - loss: 1.2593 - regression_loss: 1.1021 - classification_loss: 0.1573 394/500 [======================>.......] - ETA: 27s - loss: 1.2602 - regression_loss: 1.1028 - classification_loss: 0.1574 395/500 [======================>.......] - ETA: 27s - loss: 1.2600 - regression_loss: 1.1026 - classification_loss: 0.1573 396/500 [======================>.......] - ETA: 26s - loss: 1.2601 - regression_loss: 1.1027 - classification_loss: 0.1573 397/500 [======================>.......] - ETA: 26s - loss: 1.2606 - regression_loss: 1.1031 - classification_loss: 0.1575 398/500 [======================>.......] - ETA: 26s - loss: 1.2617 - regression_loss: 1.1042 - classification_loss: 0.1575 399/500 [======================>.......] - ETA: 26s - loss: 1.2609 - regression_loss: 1.1035 - classification_loss: 0.1574 400/500 [=======================>......] - ETA: 25s - loss: 1.2605 - regression_loss: 1.1032 - classification_loss: 0.1573 401/500 [=======================>......] - ETA: 25s - loss: 1.2605 - regression_loss: 1.1033 - classification_loss: 0.1572 402/500 [=======================>......] - ETA: 25s - loss: 1.2612 - regression_loss: 1.1040 - classification_loss: 0.1573 403/500 [=======================>......] - ETA: 25s - loss: 1.2606 - regression_loss: 1.1035 - classification_loss: 0.1571 404/500 [=======================>......] 
- ETA: 24s - loss: 1.2605 - regression_loss: 1.1033 - classification_loss: 0.1572 405/500 [=======================>......] - ETA: 24s - loss: 1.2604 - regression_loss: 1.1033 - classification_loss: 0.1571 406/500 [=======================>......] - ETA: 24s - loss: 1.2603 - regression_loss: 1.1031 - classification_loss: 0.1572 407/500 [=======================>......] - ETA: 24s - loss: 1.2615 - regression_loss: 1.1040 - classification_loss: 0.1575 408/500 [=======================>......] - ETA: 23s - loss: 1.2619 - regression_loss: 1.1043 - classification_loss: 0.1576 409/500 [=======================>......] - ETA: 23s - loss: 1.2633 - regression_loss: 1.1055 - classification_loss: 0.1578 410/500 [=======================>......] - ETA: 23s - loss: 1.2626 - regression_loss: 1.1050 - classification_loss: 0.1575 411/500 [=======================>......] - ETA: 23s - loss: 1.2628 - regression_loss: 1.1053 - classification_loss: 0.1575 412/500 [=======================>......] - ETA: 22s - loss: 1.2633 - regression_loss: 1.1057 - classification_loss: 0.1576 413/500 [=======================>......] - ETA: 22s - loss: 1.2628 - regression_loss: 1.1054 - classification_loss: 0.1574 414/500 [=======================>......] - ETA: 22s - loss: 1.2636 - regression_loss: 1.1063 - classification_loss: 0.1574 415/500 [=======================>......] - ETA: 21s - loss: 1.2633 - regression_loss: 1.1060 - classification_loss: 0.1574 416/500 [=======================>......] - ETA: 21s - loss: 1.2644 - regression_loss: 1.1069 - classification_loss: 0.1575 417/500 [========================>.....] - ETA: 21s - loss: 1.2655 - regression_loss: 1.1080 - classification_loss: 0.1575 418/500 [========================>.....] - ETA: 21s - loss: 1.2657 - regression_loss: 1.1081 - classification_loss: 0.1576 419/500 [========================>.....] - ETA: 20s - loss: 1.2655 - regression_loss: 1.1080 - classification_loss: 0.1575 420/500 [========================>.....] 
- ETA: 20s - loss: 1.2653 - regression_loss: 1.1078 - classification_loss: 0.1575 421/500 [========================>.....] - ETA: 20s - loss: 1.2634 - regression_loss: 1.1062 - classification_loss: 0.1572 422/500 [========================>.....] - ETA: 20s - loss: 1.2639 - regression_loss: 1.1066 - classification_loss: 0.1573 423/500 [========================>.....] - ETA: 19s - loss: 1.2644 - regression_loss: 1.1070 - classification_loss: 0.1573 424/500 [========================>.....] - ETA: 19s - loss: 1.2648 - regression_loss: 1.1074 - classification_loss: 0.1574 425/500 [========================>.....] - ETA: 19s - loss: 1.2654 - regression_loss: 1.1080 - classification_loss: 0.1575 426/500 [========================>.....] - ETA: 19s - loss: 1.2669 - regression_loss: 1.1088 - classification_loss: 0.1581 427/500 [========================>.....] - ETA: 18s - loss: 1.2666 - regression_loss: 1.1087 - classification_loss: 0.1579 428/500 [========================>.....] - ETA: 18s - loss: 1.2662 - regression_loss: 1.1084 - classification_loss: 0.1578 429/500 [========================>.....] - ETA: 18s - loss: 1.2676 - regression_loss: 1.1096 - classification_loss: 0.1580 430/500 [========================>.....] - ETA: 18s - loss: 1.2677 - regression_loss: 1.1096 - classification_loss: 0.1580 431/500 [========================>.....] - ETA: 17s - loss: 1.2675 - regression_loss: 1.1095 - classification_loss: 0.1579 432/500 [========================>.....] - ETA: 17s - loss: 1.2678 - regression_loss: 1.1099 - classification_loss: 0.1580 433/500 [========================>.....] - ETA: 17s - loss: 1.2685 - regression_loss: 1.1105 - classification_loss: 0.1581 434/500 [=========================>....] - ETA: 17s - loss: 1.2684 - regression_loss: 1.1104 - classification_loss: 0.1580 435/500 [=========================>....] - ETA: 16s - loss: 1.2668 - regression_loss: 1.1090 - classification_loss: 0.1578 436/500 [=========================>....] 
- ETA: 16s - loss: 1.2669 - regression_loss: 1.1091 - classification_loss: 0.1578 437/500 [=========================>....] - ETA: 16s - loss: 1.2656 - regression_loss: 1.1081 - classification_loss: 0.1576 438/500 [=========================>....] - ETA: 16s - loss: 1.2670 - regression_loss: 1.1088 - classification_loss: 0.1581 439/500 [=========================>....] - ETA: 15s - loss: 1.2675 - regression_loss: 1.1092 - classification_loss: 0.1583 440/500 [=========================>....] - ETA: 15s - loss: 1.2681 - regression_loss: 1.1098 - classification_loss: 0.1583 441/500 [=========================>....] - ETA: 15s - loss: 1.2683 - regression_loss: 1.1098 - classification_loss: 0.1585 442/500 [=========================>....] - ETA: 14s - loss: 1.2689 - regression_loss: 1.1104 - classification_loss: 0.1585 443/500 [=========================>....] - ETA: 14s - loss: 1.2698 - regression_loss: 1.1112 - classification_loss: 0.1587 444/500 [=========================>....] - ETA: 14s - loss: 1.2690 - regression_loss: 1.1106 - classification_loss: 0.1585 445/500 [=========================>....] - ETA: 14s - loss: 1.2706 - regression_loss: 1.1118 - classification_loss: 0.1589 446/500 [=========================>....] - ETA: 13s - loss: 1.2711 - regression_loss: 1.1121 - classification_loss: 0.1590 447/500 [=========================>....] - ETA: 13s - loss: 1.2708 - regression_loss: 1.1120 - classification_loss: 0.1589 448/500 [=========================>....] - ETA: 13s - loss: 1.2704 - regression_loss: 1.1116 - classification_loss: 0.1588 449/500 [=========================>....] - ETA: 13s - loss: 1.2709 - regression_loss: 1.1118 - classification_loss: 0.1591 450/500 [==========================>...] - ETA: 12s - loss: 1.2719 - regression_loss: 1.1126 - classification_loss: 0.1593 451/500 [==========================>...] - ETA: 12s - loss: 1.2710 - regression_loss: 1.1119 - classification_loss: 0.1591 452/500 [==========================>...] 
- ETA: 12s - loss: 1.2713 - regression_loss: 1.1120 - classification_loss: 0.1592 453/500 [==========================>...] - ETA: 12s - loss: 1.2720 - regression_loss: 1.1127 - classification_loss: 0.1594 454/500 [==========================>...] - ETA: 11s - loss: 1.2721 - regression_loss: 1.1127 - classification_loss: 0.1593 455/500 [==========================>...] - ETA: 11s - loss: 1.2723 - regression_loss: 1.1129 - classification_loss: 0.1593 456/500 [==========================>...] - ETA: 11s - loss: 1.2725 - regression_loss: 1.1128 - classification_loss: 0.1596 457/500 [==========================>...] - ETA: 11s - loss: 1.2731 - regression_loss: 1.1134 - classification_loss: 0.1597 458/500 [==========================>...] - ETA: 10s - loss: 1.2728 - regression_loss: 1.1132 - classification_loss: 0.1596 459/500 [==========================>...] - ETA: 10s - loss: 1.2716 - regression_loss: 1.1122 - classification_loss: 0.1594 460/500 [==========================>...] - ETA: 10s - loss: 1.2724 - regression_loss: 1.1128 - classification_loss: 0.1596 461/500 [==========================>...] - ETA: 10s - loss: 1.2719 - regression_loss: 1.1123 - classification_loss: 0.1596 462/500 [==========================>...] - ETA: 9s - loss: 1.2734 - regression_loss: 1.1135 - classification_loss: 0.1599  463/500 [==========================>...] - ETA: 9s - loss: 1.2731 - regression_loss: 1.1132 - classification_loss: 0.1599 464/500 [==========================>...] - ETA: 9s - loss: 1.2720 - regression_loss: 1.1123 - classification_loss: 0.1597 465/500 [==========================>...] - ETA: 9s - loss: 1.2734 - regression_loss: 1.1135 - classification_loss: 0.1599 466/500 [==========================>...] - ETA: 8s - loss: 1.2725 - regression_loss: 1.1128 - classification_loss: 0.1597 467/500 [===========================>..] - ETA: 8s - loss: 1.2720 - regression_loss: 1.1124 - classification_loss: 0.1596 468/500 [===========================>..] 
- ETA: 8s - loss: 1.2716 - regression_loss: 1.1119 - classification_loss: 0.1597 469/500 [===========================>..] - ETA: 7s - loss: 1.2727 - regression_loss: 1.1129 - classification_loss: 0.1598 470/500 [===========================>..] - ETA: 7s - loss: 1.2729 - regression_loss: 1.1133 - classification_loss: 0.1596 471/500 [===========================>..] - ETA: 7s - loss: 1.2731 - regression_loss: 1.1135 - classification_loss: 0.1597 472/500 [===========================>..] - ETA: 7s - loss: 1.2747 - regression_loss: 1.1147 - classification_loss: 0.1599 473/500 [===========================>..] - ETA: 6s - loss: 1.2749 - regression_loss: 1.1148 - classification_loss: 0.1602 474/500 [===========================>..] - ETA: 6s - loss: 1.2746 - regression_loss: 1.1145 - classification_loss: 0.1601 475/500 [===========================>..] - ETA: 6s - loss: 1.2745 - regression_loss: 1.1144 - classification_loss: 0.1600 476/500 [===========================>..] - ETA: 6s - loss: 1.2749 - regression_loss: 1.1147 - classification_loss: 0.1602 477/500 [===========================>..] - ETA: 5s - loss: 1.2740 - regression_loss: 1.1139 - classification_loss: 0.1601 478/500 [===========================>..] - ETA: 5s - loss: 1.2751 - regression_loss: 1.1147 - classification_loss: 0.1604 479/500 [===========================>..] - ETA: 5s - loss: 1.2755 - regression_loss: 1.1150 - classification_loss: 0.1605 480/500 [===========================>..] - ETA: 5s - loss: 1.2765 - regression_loss: 1.1158 - classification_loss: 0.1608 481/500 [===========================>..] - ETA: 4s - loss: 1.2770 - regression_loss: 1.1162 - classification_loss: 0.1608 482/500 [===========================>..] - ETA: 4s - loss: 1.2770 - regression_loss: 1.1163 - classification_loss: 0.1608 483/500 [===========================>..] - ETA: 4s - loss: 1.2765 - regression_loss: 1.1159 - classification_loss: 0.1606 484/500 [============================>.] 
- ETA: 4s - loss: 1.2763 - regression_loss: 1.1158 - classification_loss: 0.1606 485/500 [============================>.] - ETA: 3s - loss: 1.2757 - regression_loss: 1.1152 - classification_loss: 0.1604 486/500 [============================>.] - ETA: 3s - loss: 1.2742 - regression_loss: 1.1139 - classification_loss: 0.1603 487/500 [============================>.] - ETA: 3s - loss: 1.2739 - regression_loss: 1.1137 - classification_loss: 0.1602 488/500 [============================>.] - ETA: 3s - loss: 1.2740 - regression_loss: 1.1138 - classification_loss: 0.1602 489/500 [============================>.] - ETA: 2s - loss: 1.2751 - regression_loss: 1.1147 - classification_loss: 0.1604 490/500 [============================>.] - ETA: 2s - loss: 1.2746 - regression_loss: 1.1143 - classification_loss: 0.1603 491/500 [============================>.] - ETA: 2s - loss: 1.2741 - regression_loss: 1.1138 - classification_loss: 0.1602 492/500 [============================>.] - ETA: 2s - loss: 1.2737 - regression_loss: 1.1135 - classification_loss: 0.1601 493/500 [============================>.] - ETA: 1s - loss: 1.2740 - regression_loss: 1.1137 - classification_loss: 0.1602 494/500 [============================>.] - ETA: 1s - loss: 1.2735 - regression_loss: 1.1132 - classification_loss: 0.1602 495/500 [============================>.] - ETA: 1s - loss: 1.2740 - regression_loss: 1.1136 - classification_loss: 0.1603 496/500 [============================>.] - ETA: 1s - loss: 1.2745 - regression_loss: 1.1142 - classification_loss: 0.1603 497/500 [============================>.] - ETA: 0s - loss: 1.2753 - regression_loss: 1.1149 - classification_loss: 0.1603 498/500 [============================>.] - ETA: 0s - loss: 1.2749 - regression_loss: 1.1147 - classification_loss: 0.1603 499/500 [============================>.] 
- ETA: 0s - loss: 1.2735 - regression_loss: 1.1134 - classification_loss: 0.1601 500/500 [==============================] - 129s 258ms/step - loss: 1.2722 - regression_loss: 1.1123 - classification_loss: 0.1599 1172 instances of class plum with average precision: 0.7648 mAP: 0.7648 Epoch 00022: saving model to ./training/snapshots/resnet50_pascal_22.h5 Epoch 23/150 1/500 [..............................] - ETA: 1:59 - loss: 1.2450 - regression_loss: 1.1103 - classification_loss: 0.1347 2/500 [..............................] - ETA: 2:05 - loss: 0.8212 - regression_loss: 0.7372 - classification_loss: 0.0840 3/500 [..............................] - ETA: 2:05 - loss: 0.9336 - regression_loss: 0.8215 - classification_loss: 0.1121 4/500 [..............................] - ETA: 2:06 - loss: 0.9318 - regression_loss: 0.8316 - classification_loss: 0.1002 5/500 [..............................] - ETA: 2:06 - loss: 0.8768 - regression_loss: 0.7841 - classification_loss: 0.0927 6/500 [..............................] - ETA: 2:06 - loss: 1.0263 - regression_loss: 0.9015 - classification_loss: 0.1248 7/500 [..............................] - ETA: 2:06 - loss: 1.1081 - regression_loss: 0.9727 - classification_loss: 0.1354 8/500 [..............................] - ETA: 2:07 - loss: 1.0371 - regression_loss: 0.9138 - classification_loss: 0.1234 9/500 [..............................] - ETA: 2:07 - loss: 1.1100 - regression_loss: 0.9735 - classification_loss: 0.1365 10/500 [..............................] - ETA: 2:06 - loss: 1.1359 - regression_loss: 1.0049 - classification_loss: 0.1309 11/500 [..............................] - ETA: 2:06 - loss: 1.1616 - regression_loss: 1.0256 - classification_loss: 0.1360 12/500 [..............................] - ETA: 2:06 - loss: 1.1882 - regression_loss: 1.0462 - classification_loss: 0.1420 13/500 [..............................] - ETA: 2:06 - loss: 1.2324 - regression_loss: 1.0729 - classification_loss: 0.1594 14/500 [..............................] 
- ETA: 2:06 - loss: 1.2366 - regression_loss: 1.0752 - classification_loss: 0.1614 15/500 [..............................] - ETA: 2:05 - loss: 1.2051 - regression_loss: 1.0507 - classification_loss: 0.1544 16/500 [..............................] - ETA: 2:05 - loss: 1.2179 - regression_loss: 1.0619 - classification_loss: 0.1560 17/500 [>.............................] - ETA: 2:04 - loss: 1.2115 - regression_loss: 1.0543 - classification_loss: 0.1572 18/500 [>.............................] - ETA: 2:04 - loss: 1.1689 - regression_loss: 1.0182 - classification_loss: 0.1507 19/500 [>.............................] - ETA: 2:04 - loss: 1.2140 - regression_loss: 1.0574 - classification_loss: 0.1566 20/500 [>.............................] - ETA: 2:04 - loss: 1.2224 - regression_loss: 1.0654 - classification_loss: 0.1570 21/500 [>.............................] - ETA: 2:04 - loss: 1.2376 - regression_loss: 1.0782 - classification_loss: 0.1594 22/500 [>.............................] - ETA: 2:03 - loss: 1.2415 - regression_loss: 1.0817 - classification_loss: 0.1598 23/500 [>.............................] - ETA: 2:03 - loss: 1.2515 - regression_loss: 1.0877 - classification_loss: 0.1638 24/500 [>.............................] - ETA: 2:03 - loss: 1.2575 - regression_loss: 1.0932 - classification_loss: 0.1643 25/500 [>.............................] - ETA: 2:03 - loss: 1.2678 - regression_loss: 1.1028 - classification_loss: 0.1650 26/500 [>.............................] - ETA: 2:02 - loss: 1.2436 - regression_loss: 1.0780 - classification_loss: 0.1656 27/500 [>.............................] - ETA: 2:02 - loss: 1.2596 - regression_loss: 1.0924 - classification_loss: 0.1672 28/500 [>.............................] - ETA: 2:02 - loss: 1.2386 - regression_loss: 1.0742 - classification_loss: 0.1644 29/500 [>.............................] - ETA: 2:02 - loss: 1.2357 - regression_loss: 1.0712 - classification_loss: 0.1645 30/500 [>.............................] 
- ETA: 2:02 - loss: 1.2327 - regression_loss: 1.0705 - classification_loss: 0.1622 31/500 [>.............................] - ETA: 2:01 - loss: 1.2146 - regression_loss: 1.0560 - classification_loss: 0.1586 32/500 [>.............................] - ETA: 2:01 - loss: 1.2276 - regression_loss: 1.0671 - classification_loss: 0.1604 33/500 [>.............................] - ETA: 2:01 - loss: 1.2313 - regression_loss: 1.0713 - classification_loss: 0.1600 34/500 [=>............................] - ETA: 2:01 - loss: 1.2071 - regression_loss: 1.0514 - classification_loss: 0.1557 35/500 [=>............................] - ETA: 2:01 - loss: 1.2142 - regression_loss: 1.0575 - classification_loss: 0.1567 36/500 [=>............................] - ETA: 2:00 - loss: 1.2126 - regression_loss: 1.0567 - classification_loss: 0.1559 37/500 [=>............................] - ETA: 2:00 - loss: 1.2021 - regression_loss: 1.0479 - classification_loss: 0.1543 38/500 [=>............................] - ETA: 2:00 - loss: 1.2125 - regression_loss: 1.0569 - classification_loss: 0.1555 39/500 [=>............................] - ETA: 1:59 - loss: 1.2279 - regression_loss: 1.0698 - classification_loss: 0.1581 40/500 [=>............................] - ETA: 1:59 - loss: 1.2214 - regression_loss: 1.0656 - classification_loss: 0.1558 41/500 [=>............................] - ETA: 1:59 - loss: 1.2204 - regression_loss: 1.0648 - classification_loss: 0.1556 42/500 [=>............................] - ETA: 1:59 - loss: 1.2032 - regression_loss: 1.0480 - classification_loss: 0.1552 43/500 [=>............................] - ETA: 1:58 - loss: 1.2051 - regression_loss: 1.0501 - classification_loss: 0.1550 44/500 [=>............................] - ETA: 1:58 - loss: 1.2154 - regression_loss: 1.0575 - classification_loss: 0.1579 45/500 [=>............................] - ETA: 1:58 - loss: 1.2205 - regression_loss: 1.0632 - classification_loss: 0.1573 46/500 [=>............................] 
- ETA: 1:58 - loss: 1.2136 - regression_loss: 1.0580 - classification_loss: 0.1555 47/500 [=>............................] - ETA: 1:57 - loss: 1.2222 - regression_loss: 1.0656 - classification_loss: 0.1566 48/500 [=>............................] - ETA: 1:57 - loss: 1.2101 - regression_loss: 1.0552 - classification_loss: 0.1549 49/500 [=>............................] - ETA: 1:57 - loss: 1.2148 - regression_loss: 1.0613 - classification_loss: 0.1535 50/500 [==>...........................] - ETA: 1:57 - loss: 1.1999 - regression_loss: 1.0490 - classification_loss: 0.1510 51/500 [==>...........................] - ETA: 1:56 - loss: 1.2103 - regression_loss: 1.0581 - classification_loss: 0.1522 52/500 [==>...........................] - ETA: 1:56 - loss: 1.2144 - regression_loss: 1.0617 - classification_loss: 0.1527 53/500 [==>...........................] - ETA: 1:56 - loss: 1.2064 - regression_loss: 1.0553 - classification_loss: 0.1511 54/500 [==>...........................] - ETA: 1:56 - loss: 1.2120 - regression_loss: 1.0602 - classification_loss: 0.1518 55/500 [==>...........................] - ETA: 1:55 - loss: 1.2004 - regression_loss: 1.0490 - classification_loss: 0.1514 56/500 [==>...........................] - ETA: 1:55 - loss: 1.2018 - regression_loss: 1.0501 - classification_loss: 0.1517 57/500 [==>...........................] - ETA: 1:55 - loss: 1.2051 - regression_loss: 1.0535 - classification_loss: 0.1516 58/500 [==>...........................] - ETA: 1:55 - loss: 1.1973 - regression_loss: 1.0471 - classification_loss: 0.1503 59/500 [==>...........................] - ETA: 1:54 - loss: 1.1958 - regression_loss: 1.0450 - classification_loss: 0.1509 60/500 [==>...........................] - ETA: 1:54 - loss: 1.2022 - regression_loss: 1.0502 - classification_loss: 0.1520 61/500 [==>...........................] - ETA: 1:54 - loss: 1.2012 - regression_loss: 1.0505 - classification_loss: 0.1508 62/500 [==>...........................] 
- ETA: 1:54 - loss: 1.2029 - regression_loss: 1.0521 - classification_loss: 0.1508 63/500 [==>...........................] - ETA: 1:53 - loss: 1.2051 - regression_loss: 1.0539 - classification_loss: 0.1511 64/500 [==>...........................] - ETA: 1:53 - loss: 1.2001 - regression_loss: 1.0498 - classification_loss: 0.1503 65/500 [==>...........................] - ETA: 1:53 - loss: 1.1922 - regression_loss: 1.0432 - classification_loss: 0.1490 66/500 [==>...........................] - ETA: 1:52 - loss: 1.1811 - regression_loss: 1.0337 - classification_loss: 0.1474 67/500 [===>..........................] - ETA: 1:52 - loss: 1.1847 - regression_loss: 1.0375 - classification_loss: 0.1471 68/500 [===>..........................] - ETA: 1:52 - loss: 1.1801 - regression_loss: 1.0333 - classification_loss: 0.1468 69/500 [===>..........................] - ETA: 1:52 - loss: 1.1795 - regression_loss: 1.0332 - classification_loss: 0.1462 70/500 [===>..........................] - ETA: 1:51 - loss: 1.1801 - regression_loss: 1.0340 - classification_loss: 0.1461 71/500 [===>..........................] - ETA: 1:51 - loss: 1.1804 - regression_loss: 1.0345 - classification_loss: 0.1459 72/500 [===>..........................] - ETA: 1:51 - loss: 1.1734 - regression_loss: 1.0278 - classification_loss: 0.1455 73/500 [===>..........................] - ETA: 1:51 - loss: 1.1745 - regression_loss: 1.0291 - classification_loss: 0.1454 74/500 [===>..........................] - ETA: 1:50 - loss: 1.1776 - regression_loss: 1.0312 - classification_loss: 0.1464 75/500 [===>..........................] - ETA: 1:50 - loss: 1.1832 - regression_loss: 1.0361 - classification_loss: 0.1471 76/500 [===>..........................] - ETA: 1:50 - loss: 1.1880 - regression_loss: 1.0402 - classification_loss: 0.1478 77/500 [===>..........................] - ETA: 1:50 - loss: 1.1863 - regression_loss: 1.0391 - classification_loss: 0.1472 78/500 [===>..........................] 
- ETA: 1:49 - loss: 1.1868 - regression_loss: 1.0403 - classification_loss: 0.1466 79/500 [===>..........................] - ETA: 1:49 - loss: 1.1796 - regression_loss: 1.0341 - classification_loss: 0.1455 80/500 [===>..........................] - ETA: 1:49 - loss: 1.1861 - regression_loss: 1.0399 - classification_loss: 0.1463 81/500 [===>..........................] - ETA: 1:49 - loss: 1.1836 - regression_loss: 1.0381 - classification_loss: 0.1455 82/500 [===>..........................] - ETA: 1:49 - loss: 1.1821 - regression_loss: 1.0370 - classification_loss: 0.1451 83/500 [===>..........................] - ETA: 1:48 - loss: 1.1854 - regression_loss: 1.0398 - classification_loss: 0.1456 84/500 [====>.........................] - ETA: 1:48 - loss: 1.1877 - regression_loss: 1.0422 - classification_loss: 0.1456 85/500 [====>.........................] - ETA: 1:48 - loss: 1.1898 - regression_loss: 1.0432 - classification_loss: 0.1466 86/500 [====>.........................] - ETA: 1:47 - loss: 1.1965 - regression_loss: 1.0491 - classification_loss: 0.1474 87/500 [====>.........................] - ETA: 1:47 - loss: 1.2076 - regression_loss: 1.0587 - classification_loss: 0.1489 88/500 [====>.........................] - ETA: 1:47 - loss: 1.2079 - regression_loss: 1.0588 - classification_loss: 0.1491 89/500 [====>.........................] - ETA: 1:46 - loss: 1.2064 - regression_loss: 1.0579 - classification_loss: 0.1485 90/500 [====>.........................] - ETA: 1:46 - loss: 1.1983 - regression_loss: 1.0508 - classification_loss: 0.1475 91/500 [====>.........................] - ETA: 1:46 - loss: 1.1992 - regression_loss: 1.0514 - classification_loss: 0.1478 92/500 [====>.........................] - ETA: 1:45 - loss: 1.2030 - regression_loss: 1.0545 - classification_loss: 0.1485 93/500 [====>.........................] - ETA: 1:45 - loss: 1.2028 - regression_loss: 1.0542 - classification_loss: 0.1486 94/500 [====>.........................] 
- ETA: 1:45 - loss: 1.2000 - regression_loss: 1.0521 - classification_loss: 0.1480 95/500 [====>.........................] - ETA: 1:44 - loss: 1.2019 - regression_loss: 1.0536 - classification_loss: 0.1483 96/500 [====>.........................] - ETA: 1:44 - loss: 1.2020 - regression_loss: 1.0532 - classification_loss: 0.1487 97/500 [====>.........................] - ETA: 1:44 - loss: 1.2103 - regression_loss: 1.0605 - classification_loss: 0.1499 98/500 [====>.........................] - ETA: 1:44 - loss: 1.2112 - regression_loss: 1.0614 - classification_loss: 0.1498 99/500 [====>.........................] - ETA: 1:43 - loss: 1.2048 - regression_loss: 1.0562 - classification_loss: 0.1486 100/500 [=====>........................] - ETA: 1:43 - loss: 1.2027 - regression_loss: 1.0548 - classification_loss: 0.1479 101/500 [=====>........................] - ETA: 1:43 - loss: 1.2079 - regression_loss: 1.0593 - classification_loss: 0.1486 102/500 [=====>........................] - ETA: 1:43 - loss: 1.2057 - regression_loss: 1.0575 - classification_loss: 0.1481 103/500 [=====>........................] - ETA: 1:43 - loss: 1.2089 - regression_loss: 1.0601 - classification_loss: 0.1488 104/500 [=====>........................] - ETA: 1:42 - loss: 1.2106 - regression_loss: 1.0615 - classification_loss: 0.1491 105/500 [=====>........................] - ETA: 1:42 - loss: 1.2111 - regression_loss: 1.0620 - classification_loss: 0.1491 106/500 [=====>........................] - ETA: 1:42 - loss: 1.2115 - regression_loss: 1.0629 - classification_loss: 0.1486 107/500 [=====>........................] - ETA: 1:41 - loss: 1.2127 - regression_loss: 1.0639 - classification_loss: 0.1488 108/500 [=====>........................] - ETA: 1:41 - loss: 1.2112 - regression_loss: 1.0633 - classification_loss: 0.1479 109/500 [=====>........................] - ETA: 1:41 - loss: 1.2152 - regression_loss: 1.0658 - classification_loss: 0.1494 110/500 [=====>........................] 
- ETA: 1:41 - loss: 1.2168 - regression_loss: 1.0674 - classification_loss: 0.1494 111/500 [=====>........................] - ETA: 1:40 - loss: 1.2181 - regression_loss: 1.0688 - classification_loss: 0.1493 112/500 [=====>........................] - ETA: 1:40 - loss: 1.2180 - regression_loss: 1.0692 - classification_loss: 0.1488 113/500 [=====>........................] - ETA: 1:40 - loss: 1.2125 - regression_loss: 1.0647 - classification_loss: 0.1478 114/500 [=====>........................] - ETA: 1:40 - loss: 1.2104 - regression_loss: 1.0633 - classification_loss: 0.1471 115/500 [=====>........................] - ETA: 1:39 - loss: 1.2053 - regression_loss: 1.0589 - classification_loss: 0.1464 116/500 [=====>........................] - ETA: 1:39 - loss: 1.2032 - regression_loss: 1.0574 - classification_loss: 0.1457 117/500 [======>.......................] - ETA: 1:39 - loss: 1.2067 - regression_loss: 1.0605 - classification_loss: 0.1461 118/500 [======>.......................] - ETA: 1:39 - loss: 1.2075 - regression_loss: 1.0612 - classification_loss: 0.1463 119/500 [======>.......................] - ETA: 1:38 - loss: 1.2097 - regression_loss: 1.0633 - classification_loss: 0.1464 120/500 [======>.......................] - ETA: 1:38 - loss: 1.2092 - regression_loss: 1.0623 - classification_loss: 0.1468 121/500 [======>.......................] - ETA: 1:38 - loss: 1.2119 - regression_loss: 1.0647 - classification_loss: 0.1472 122/500 [======>.......................] - ETA: 1:38 - loss: 1.2157 - regression_loss: 1.0679 - classification_loss: 0.1478 123/500 [======>.......................] - ETA: 1:37 - loss: 1.2173 - regression_loss: 1.0693 - classification_loss: 0.1480 124/500 [======>.......................] - ETA: 1:37 - loss: 1.2183 - regression_loss: 1.0705 - classification_loss: 0.1478 125/500 [======>.......................] - ETA: 1:37 - loss: 1.2190 - regression_loss: 1.0711 - classification_loss: 0.1478 126/500 [======>.......................] 
- ETA: 1:37 - loss: 1.2228 - regression_loss: 1.0745 - classification_loss: 0.1483 127/500 [======>.......................] - ETA: 1:36 - loss: 1.2232 - regression_loss: 1.0750 - classification_loss: 0.1482 128/500 [======>.......................] - ETA: 1:36 - loss: 1.2261 - regression_loss: 1.0773 - classification_loss: 0.1488 129/500 [======>.......................] - ETA: 1:36 - loss: 1.2312 - regression_loss: 1.0812 - classification_loss: 0.1500 130/500 [======>.......................] - ETA: 1:36 - loss: 1.2319 - regression_loss: 1.0816 - classification_loss: 0.1502 131/500 [======>.......................] - ETA: 1:35 - loss: 1.2368 - regression_loss: 1.0851 - classification_loss: 0.1517 132/500 [======>.......................] - ETA: 1:35 - loss: 1.2324 - regression_loss: 1.0809 - classification_loss: 0.1515 133/500 [======>.......................] - ETA: 1:35 - loss: 1.2286 - regression_loss: 1.0766 - classification_loss: 0.1520 134/500 [=======>......................] - ETA: 1:35 - loss: 1.2240 - regression_loss: 1.0725 - classification_loss: 0.1514 135/500 [=======>......................] - ETA: 1:34 - loss: 1.2250 - regression_loss: 1.0736 - classification_loss: 0.1514 136/500 [=======>......................] - ETA: 1:34 - loss: 1.2261 - regression_loss: 1.0753 - classification_loss: 0.1508 137/500 [=======>......................] - ETA: 1:34 - loss: 1.2304 - regression_loss: 1.0792 - classification_loss: 0.1512 138/500 [=======>......................] - ETA: 1:33 - loss: 1.2306 - regression_loss: 1.0795 - classification_loss: 0.1512 139/500 [=======>......................] - ETA: 1:33 - loss: 1.2324 - regression_loss: 1.0809 - classification_loss: 0.1515 140/500 [=======>......................] - ETA: 1:33 - loss: 1.2340 - regression_loss: 1.0823 - classification_loss: 0.1517 141/500 [=======>......................] - ETA: 1:33 - loss: 1.2347 - regression_loss: 1.0832 - classification_loss: 0.1515 142/500 [=======>......................] 
- ETA: 1:32 - loss: 1.2361 - regression_loss: 1.0844 - classification_loss: 0.1517 143/500 [=======>......................] - ETA: 1:32 - loss: 1.2346 - regression_loss: 1.0830 - classification_loss: 0.1516 144/500 [=======>......................] - ETA: 1:32 - loss: 1.2364 - regression_loss: 1.0843 - classification_loss: 0.1520 145/500 [=======>......................] - ETA: 1:32 - loss: 1.2307 - regression_loss: 1.0793 - classification_loss: 0.1514 146/500 [=======>......................] - ETA: 1:31 - loss: 1.2263 - regression_loss: 1.0752 - classification_loss: 0.1510 147/500 [=======>......................] - ETA: 1:31 - loss: 1.2224 - regression_loss: 1.0722 - classification_loss: 0.1502 148/500 [=======>......................] - ETA: 1:31 - loss: 1.2207 - regression_loss: 1.0707 - classification_loss: 0.1499 149/500 [=======>......................] - ETA: 1:31 - loss: 1.2206 - regression_loss: 1.0706 - classification_loss: 0.1500 150/500 [========>.....................] - ETA: 1:30 - loss: 1.2246 - regression_loss: 1.0744 - classification_loss: 0.1503 151/500 [========>.....................] - ETA: 1:30 - loss: 1.2213 - regression_loss: 1.0714 - classification_loss: 0.1499 152/500 [========>.....................] - ETA: 1:30 - loss: 1.2210 - regression_loss: 1.0711 - classification_loss: 0.1499 153/500 [========>.....................] - ETA: 1:30 - loss: 1.2226 - regression_loss: 1.0723 - classification_loss: 0.1502 154/500 [========>.....................] - ETA: 1:29 - loss: 1.2255 - regression_loss: 1.0747 - classification_loss: 0.1508 155/500 [========>.....................] - ETA: 1:29 - loss: 1.2231 - regression_loss: 1.0728 - classification_loss: 0.1503 156/500 [========>.....................] - ETA: 1:29 - loss: 1.2240 - regression_loss: 1.0737 - classification_loss: 0.1503 157/500 [========>.....................] - ETA: 1:29 - loss: 1.2198 - regression_loss: 1.0702 - classification_loss: 0.1496 158/500 [========>.....................] 
- ETA: 1:28 - loss: 1.2175 - regression_loss: 1.0685 - classification_loss: 0.1490 159/500 [========>.....................] - ETA: 1:28 - loss: 1.2172 - regression_loss: 1.0684 - classification_loss: 0.1489 160/500 [========>.....................] - ETA: 1:28 - loss: 1.2144 - regression_loss: 1.0660 - classification_loss: 0.1484 161/500 [========>.....................] - ETA: 1:28 - loss: 1.2125 - regression_loss: 1.0645 - classification_loss: 0.1480 162/500 [========>.....................] - ETA: 1:27 - loss: 1.2137 - regression_loss: 1.0655 - classification_loss: 0.1482 163/500 [========>.....................] - ETA: 1:27 - loss: 1.2116 - regression_loss: 1.0636 - classification_loss: 0.1479 164/500 [========>.....................] - ETA: 1:27 - loss: 1.2140 - regression_loss: 1.0657 - classification_loss: 0.1483 165/500 [========>.....................] - ETA: 1:26 - loss: 1.2139 - regression_loss: 1.0657 - classification_loss: 0.1482 166/500 [========>.....................] - ETA: 1:26 - loss: 1.2134 - regression_loss: 1.0651 - classification_loss: 0.1483 167/500 [=========>....................] - ETA: 1:26 - loss: 1.2154 - regression_loss: 1.0669 - classification_loss: 0.1486 168/500 [=========>....................] - ETA: 1:26 - loss: 1.2136 - regression_loss: 1.0655 - classification_loss: 0.1481 169/500 [=========>....................] - ETA: 1:25 - loss: 1.2130 - regression_loss: 1.0649 - classification_loss: 0.1480 170/500 [=========>....................] - ETA: 1:25 - loss: 1.2133 - regression_loss: 1.0654 - classification_loss: 0.1479 171/500 [=========>....................] - ETA: 1:25 - loss: 1.2153 - regression_loss: 1.0672 - classification_loss: 0.1480 172/500 [=========>....................] - ETA: 1:25 - loss: 1.2126 - regression_loss: 1.0650 - classification_loss: 0.1477 173/500 [=========>....................] - ETA: 1:24 - loss: 1.2136 - regression_loss: 1.0656 - classification_loss: 0.1479 174/500 [=========>....................] 
- ETA: 1:24 - loss: 1.2115 - regression_loss: 1.0639 - classification_loss: 0.1476 175/500 [=========>....................] - ETA: 1:24 - loss: 1.2110 - regression_loss: 1.0636 - classification_loss: 0.1473 176/500 [=========>....................] - ETA: 1:24 - loss: 1.2111 - regression_loss: 1.0634 - classification_loss: 0.1477 177/500 [=========>....................] - ETA: 1:23 - loss: 1.2082 - regression_loss: 1.0611 - classification_loss: 0.1471 178/500 [=========>....................] - ETA: 1:23 - loss: 1.2074 - regression_loss: 1.0602 - classification_loss: 0.1471 179/500 [=========>....................] - ETA: 1:23 - loss: 1.2086 - regression_loss: 1.0617 - classification_loss: 0.1469 180/500 [=========>....................] - ETA: 1:23 - loss: 1.2092 - regression_loss: 1.0621 - classification_loss: 0.1470 181/500 [=========>....................] - ETA: 1:22 - loss: 1.2082 - regression_loss: 1.0613 - classification_loss: 0.1469 182/500 [=========>....................] - ETA: 1:22 - loss: 1.2081 - regression_loss: 1.0613 - classification_loss: 0.1468 183/500 [=========>....................] - ETA: 1:22 - loss: 1.2085 - regression_loss: 1.0616 - classification_loss: 0.1469 184/500 [==========>...................] - ETA: 1:22 - loss: 1.2050 - regression_loss: 1.0587 - classification_loss: 0.1463 185/500 [==========>...................] - ETA: 1:21 - loss: 1.2045 - regression_loss: 1.0582 - classification_loss: 0.1462 186/500 [==========>...................] - ETA: 1:21 - loss: 1.2020 - regression_loss: 1.0561 - classification_loss: 0.1459 187/500 [==========>...................] - ETA: 1:21 - loss: 1.1984 - regression_loss: 1.0530 - classification_loss: 0.1454 188/500 [==========>...................] - ETA: 1:21 - loss: 1.1952 - regression_loss: 1.0503 - classification_loss: 0.1449 189/500 [==========>...................] - ETA: 1:20 - loss: 1.1913 - regression_loss: 1.0468 - classification_loss: 0.1445 190/500 [==========>...................] 
- ETA: 1:20 - loss: 1.1926 - regression_loss: 1.0479 - classification_loss: 0.1447 191/500 [==========>...................] - ETA: 1:20 - loss: 1.1932 - regression_loss: 1.0487 - classification_loss: 0.1445 192/500 [==========>...................] - ETA: 1:19 - loss: 1.1955 - regression_loss: 1.0502 - classification_loss: 0.1453 193/500 [==========>...................] - ETA: 1:19 - loss: 1.1988 - regression_loss: 1.0531 - classification_loss: 0.1457 194/500 [==========>...................] - ETA: 1:19 - loss: 1.1970 - regression_loss: 1.0517 - classification_loss: 0.1454 195/500 [==========>...................] - ETA: 1:19 - loss: 1.1927 - regression_loss: 1.0479 - classification_loss: 0.1447 196/500 [==========>...................] - ETA: 1:18 - loss: 1.1925 - regression_loss: 1.0479 - classification_loss: 0.1446 197/500 [==========>...................] - ETA: 1:18 - loss: 1.1937 - regression_loss: 1.0490 - classification_loss: 0.1446 198/500 [==========>...................] - ETA: 1:18 - loss: 1.1930 - regression_loss: 1.0485 - classification_loss: 0.1444 199/500 [==========>...................] - ETA: 1:18 - loss: 1.1932 - regression_loss: 1.0490 - classification_loss: 0.1442 200/500 [===========>..................] - ETA: 1:17 - loss: 1.1974 - regression_loss: 1.0523 - classification_loss: 0.1450 201/500 [===========>..................] - ETA: 1:17 - loss: 1.1946 - regression_loss: 1.0501 - classification_loss: 0.1445 202/500 [===========>..................] - ETA: 1:17 - loss: 1.1990 - regression_loss: 1.0538 - classification_loss: 0.1452 203/500 [===========>..................] - ETA: 1:17 - loss: 1.2013 - regression_loss: 1.0557 - classification_loss: 0.1456 204/500 [===========>..................] - ETA: 1:16 - loss: 1.2015 - regression_loss: 1.0558 - classification_loss: 0.1457 205/500 [===========>..................] - ETA: 1:16 - loss: 1.2039 - regression_loss: 1.0578 - classification_loss: 0.1461 206/500 [===========>..................] 
- ETA: 1:16 - loss: 1.2011 - regression_loss: 1.0555 - classification_loss: 0.1456 207/500 [===========>..................] - ETA: 1:16 - loss: 1.1990 - regression_loss: 1.0536 - classification_loss: 0.1454 208/500 [===========>..................] - ETA: 1:15 - loss: 1.2019 - regression_loss: 1.0559 - classification_loss: 0.1460 209/500 [===========>..................] - ETA: 1:15 - loss: 1.2032 - regression_loss: 1.0572 - classification_loss: 0.1460 210/500 [===========>..................] - ETA: 1:15 - loss: 1.2048 - regression_loss: 1.0586 - classification_loss: 0.1463 211/500 [===========>..................] - ETA: 1:14 - loss: 1.2063 - regression_loss: 1.0597 - classification_loss: 0.1466 212/500 [===========>..................] - ETA: 1:14 - loss: 1.2039 - regression_loss: 1.0577 - classification_loss: 0.1462 213/500 [===========>..................] - ETA: 1:14 - loss: 1.2042 - regression_loss: 1.0576 - classification_loss: 0.1465 214/500 [===========>..................] - ETA: 1:14 - loss: 1.2064 - regression_loss: 1.0597 - classification_loss: 0.1468 215/500 [===========>..................] - ETA: 1:13 - loss: 1.2049 - regression_loss: 1.0582 - classification_loss: 0.1466 216/500 [===========>..................] - ETA: 1:13 - loss: 1.2055 - regression_loss: 1.0590 - classification_loss: 0.1465 217/500 [============>.................] - ETA: 1:13 - loss: 1.2061 - regression_loss: 1.0594 - classification_loss: 0.1466 218/500 [============>.................] - ETA: 1:13 - loss: 1.2076 - regression_loss: 1.0603 - classification_loss: 0.1473 219/500 [============>.................] - ETA: 1:12 - loss: 1.2079 - regression_loss: 1.0605 - classification_loss: 0.1474 220/500 [============>.................] - ETA: 1:12 - loss: 1.2080 - regression_loss: 1.0606 - classification_loss: 0.1475 221/500 [============>.................] - ETA: 1:12 - loss: 1.2081 - regression_loss: 1.0606 - classification_loss: 0.1475 222/500 [============>.................] 
- ETA: 1:12 - loss: 1.2104 - regression_loss: 1.0626 - classification_loss: 0.1478 223/500 [============>.................] - ETA: 1:11 - loss: 1.2110 - regression_loss: 1.0632 - classification_loss: 0.1478 224/500 [============>.................] - ETA: 1:11 - loss: 1.2127 - regression_loss: 1.0643 - classification_loss: 0.1484 225/500 [============>.................] - ETA: 1:11 - loss: 1.2117 - regression_loss: 1.0635 - classification_loss: 0.1483 226/500 [============>.................] - ETA: 1:11 - loss: 1.2109 - regression_loss: 1.0629 - classification_loss: 0.1480 227/500 [============>.................] - ETA: 1:10 - loss: 1.2127 - regression_loss: 1.0645 - classification_loss: 0.1482 228/500 [============>.................] - ETA: 1:10 - loss: 1.2130 - regression_loss: 1.0650 - classification_loss: 0.1480 229/500 [============>.................] - ETA: 1:10 - loss: 1.2102 - regression_loss: 1.0627 - classification_loss: 0.1476 230/500 [============>.................] - ETA: 1:10 - loss: 1.2096 - regression_loss: 1.0621 - classification_loss: 0.1475 231/500 [============>.................] - ETA: 1:09 - loss: 1.2111 - regression_loss: 1.0634 - classification_loss: 0.1477 232/500 [============>.................] - ETA: 1:09 - loss: 1.2129 - regression_loss: 1.0648 - classification_loss: 0.1481 233/500 [============>.................] - ETA: 1:09 - loss: 1.2145 - regression_loss: 1.0661 - classification_loss: 0.1484 234/500 [=============>................] - ETA: 1:09 - loss: 1.2174 - regression_loss: 1.0685 - classification_loss: 0.1489 235/500 [=============>................] - ETA: 1:08 - loss: 1.2152 - regression_loss: 1.0667 - classification_loss: 0.1485 236/500 [=============>................] - ETA: 1:08 - loss: 1.2159 - regression_loss: 1.0668 - classification_loss: 0.1491 237/500 [=============>................] - ETA: 1:08 - loss: 1.2177 - regression_loss: 1.0684 - classification_loss: 0.1493 238/500 [=============>................] 
- ETA: 1:07 - loss: 1.2189 - regression_loss: 1.0696 - classification_loss: 0.1493 239/500 [=============>................] - ETA: 1:07 - loss: 1.2166 - regression_loss: 1.0676 - classification_loss: 0.1491 240/500 [=============>................] - ETA: 1:07 - loss: 1.2180 - regression_loss: 1.0690 - classification_loss: 0.1491 241/500 [=============>................] - ETA: 1:07 - loss: 1.2173 - regression_loss: 1.0684 - classification_loss: 0.1489 242/500 [=============>................] - ETA: 1:07 - loss: 1.2184 - regression_loss: 1.0693 - classification_loss: 0.1491 243/500 [=============>................] - ETA: 1:06 - loss: 1.2177 - regression_loss: 1.0686 - classification_loss: 0.1491 244/500 [=============>................] - ETA: 1:06 - loss: 1.2154 - regression_loss: 1.0667 - classification_loss: 0.1488 245/500 [=============>................] - ETA: 1:06 - loss: 1.2160 - regression_loss: 1.0671 - classification_loss: 0.1489 246/500 [=============>................] - ETA: 1:05 - loss: 1.2148 - regression_loss: 1.0660 - classification_loss: 0.1488 247/500 [=============>................] - ETA: 1:05 - loss: 1.2166 - regression_loss: 1.0675 - classification_loss: 0.1491 248/500 [=============>................] - ETA: 1:05 - loss: 1.2176 - regression_loss: 1.0685 - classification_loss: 0.1491 249/500 [=============>................] - ETA: 1:05 - loss: 1.2154 - regression_loss: 1.0664 - classification_loss: 0.1490 250/500 [==============>...............] - ETA: 1:04 - loss: 1.2140 - regression_loss: 1.0654 - classification_loss: 0.1486 251/500 [==============>...............] - ETA: 1:04 - loss: 1.2123 - regression_loss: 1.0641 - classification_loss: 0.1483 252/500 [==============>...............] - ETA: 1:04 - loss: 1.2105 - regression_loss: 1.0625 - classification_loss: 0.1480 253/500 [==============>...............] - ETA: 1:04 - loss: 1.2096 - regression_loss: 1.0619 - classification_loss: 0.1476 254/500 [==============>...............] 
- ETA: 1:03 - loss: 1.2101 - regression_loss: 1.0623 - classification_loss: 0.1478 255/500 [==============>...............] - ETA: 1:03 - loss: 1.2112 - regression_loss: 1.0631 - classification_loss: 0.1481 256/500 [==============>...............] - ETA: 1:03 - loss: 1.2132 - regression_loss: 1.0646 - classification_loss: 0.1485 257/500 [==============>...............] - ETA: 1:03 - loss: 1.2145 - regression_loss: 1.0657 - classification_loss: 0.1489 258/500 [==============>...............] - ETA: 1:02 - loss: 1.2150 - regression_loss: 1.0661 - classification_loss: 0.1489 259/500 [==============>...............] - ETA: 1:02 - loss: 1.2154 - regression_loss: 1.0666 - classification_loss: 0.1488 260/500 [==============>...............] - ETA: 1:02 - loss: 1.2167 - regression_loss: 1.0679 - classification_loss: 0.1488 261/500 [==============>...............] - ETA: 1:02 - loss: 1.2196 - regression_loss: 1.0703 - classification_loss: 0.1493 262/500 [==============>...............] - ETA: 1:01 - loss: 1.2199 - regression_loss: 1.0706 - classification_loss: 0.1493 263/500 [==============>...............] - ETA: 1:01 - loss: 1.2210 - regression_loss: 1.0716 - classification_loss: 0.1494 264/500 [==============>...............] - ETA: 1:01 - loss: 1.2231 - regression_loss: 1.0733 - classification_loss: 0.1497 265/500 [==============>...............] - ETA: 1:01 - loss: 1.2236 - regression_loss: 1.0737 - classification_loss: 0.1499 266/500 [==============>...............] - ETA: 1:00 - loss: 1.2229 - regression_loss: 1.0732 - classification_loss: 0.1497 267/500 [===============>..............] - ETA: 1:00 - loss: 1.2243 - regression_loss: 1.0743 - classification_loss: 0.1500 268/500 [===============>..............] - ETA: 1:00 - loss: 1.2222 - regression_loss: 1.0725 - classification_loss: 0.1497 269/500 [===============>..............] - ETA: 59s - loss: 1.2207 - regression_loss: 1.0713 - classification_loss: 0.1494 270/500 [===============>..............] 
- ETA: 59s - loss: 1.2242 - regression_loss: 1.0746 - classification_loss: 0.1497 271/500 [===============>..............] - ETA: 59s - loss: 1.2255 - regression_loss: 1.0758 - classification_loss: 0.1498 272/500 [===============>..............] - ETA: 59s - loss: 1.2231 - regression_loss: 1.0737 - classification_loss: 0.1494 273/500 [===============>..............] - ETA: 58s - loss: 1.2230 - regression_loss: 1.0738 - classification_loss: 0.1492 274/500 [===============>..............] - ETA: 58s - loss: 1.2212 - regression_loss: 1.0724 - classification_loss: 0.1489 275/500 [===============>..............] - ETA: 58s - loss: 1.2245 - regression_loss: 1.0753 - classification_loss: 0.1493 276/500 [===============>..............] - ETA: 58s - loss: 1.2262 - regression_loss: 1.0766 - classification_loss: 0.1496 277/500 [===============>..............] - ETA: 57s - loss: 1.2261 - regression_loss: 1.0763 - classification_loss: 0.1498 278/500 [===============>..............] - ETA: 57s - loss: 1.2262 - regression_loss: 1.0764 - classification_loss: 0.1497 279/500 [===============>..............] - ETA: 57s - loss: 1.2264 - regression_loss: 1.0766 - classification_loss: 0.1498 280/500 [===============>..............] - ETA: 57s - loss: 1.2266 - regression_loss: 1.0767 - classification_loss: 0.1499 281/500 [===============>..............] - ETA: 56s - loss: 1.2264 - regression_loss: 1.0765 - classification_loss: 0.1499 282/500 [===============>..............] - ETA: 56s - loss: 1.2269 - regression_loss: 1.0770 - classification_loss: 0.1500 283/500 [===============>..............] - ETA: 56s - loss: 1.2295 - regression_loss: 1.0792 - classification_loss: 0.1502 284/500 [================>.............] - ETA: 55s - loss: 1.2274 - regression_loss: 1.0774 - classification_loss: 0.1500 285/500 [================>.............] - ETA: 55s - loss: 1.2279 - regression_loss: 1.0779 - classification_loss: 0.1499 286/500 [================>.............] 
- ETA: 55s - loss: 1.2287 - regression_loss: 1.0786 - classification_loss: 0.1501 287/500 [================>.............] - ETA: 55s - loss: 1.2297 - regression_loss: 1.0794 - classification_loss: 0.1504 288/500 [================>.............] - ETA: 54s - loss: 1.2307 - regression_loss: 1.0802 - classification_loss: 0.1505 289/500 [================>.............] - ETA: 54s - loss: 1.2328 - regression_loss: 1.0820 - classification_loss: 0.1508 290/500 [================>.............] - ETA: 54s - loss: 1.2309 - regression_loss: 1.0803 - classification_loss: 0.1505 291/500 [================>.............] - ETA: 54s - loss: 1.2315 - regression_loss: 1.0807 - classification_loss: 0.1508 292/500 [================>.............] - ETA: 53s - loss: 1.2324 - regression_loss: 1.0814 - classification_loss: 0.1509 293/500 [================>.............] - ETA: 53s - loss: 1.2340 - regression_loss: 1.0826 - classification_loss: 0.1514 294/500 [================>.............] - ETA: 53s - loss: 1.2357 - regression_loss: 1.0841 - classification_loss: 0.1516 295/500 [================>.............] - ETA: 53s - loss: 1.2355 - regression_loss: 1.0840 - classification_loss: 0.1516 296/500 [================>.............] - ETA: 52s - loss: 1.2356 - regression_loss: 1.0841 - classification_loss: 0.1516 297/500 [================>.............] - ETA: 52s - loss: 1.2359 - regression_loss: 1.0843 - classification_loss: 0.1517 298/500 [================>.............] - ETA: 52s - loss: 1.2370 - regression_loss: 1.0852 - classification_loss: 0.1518 299/500 [================>.............] - ETA: 52s - loss: 1.2354 - regression_loss: 1.0839 - classification_loss: 0.1515 300/500 [=================>............] - ETA: 51s - loss: 1.2359 - regression_loss: 1.0844 - classification_loss: 0.1515 301/500 [=================>............] - ETA: 51s - loss: 1.2371 - regression_loss: 1.0854 - classification_loss: 0.1517 302/500 [=================>............] 
- ETA: 51s - loss: 1.2350 - regression_loss: 1.0836 - classification_loss: 0.1514 303/500 [=================>............] - ETA: 51s - loss: 1.2356 - regression_loss: 1.0841 - classification_loss: 0.1515 304/500 [=================>............] - ETA: 50s - loss: 1.2336 - regression_loss: 1.0824 - classification_loss: 0.1512 305/500 [=================>............] - ETA: 50s - loss: 1.2326 - regression_loss: 1.0817 - classification_loss: 0.1508 306/500 [=================>............] - ETA: 50s - loss: 1.2328 - regression_loss: 1.0819 - classification_loss: 0.1509 307/500 [=================>............] - ETA: 50s - loss: 1.2333 - regression_loss: 1.0826 - classification_loss: 0.1507 308/500 [=================>............] - ETA: 49s - loss: 1.2340 - regression_loss: 1.0831 - classification_loss: 0.1508 309/500 [=================>............] - ETA: 49s - loss: 1.2346 - regression_loss: 1.0837 - classification_loss: 0.1509 310/500 [=================>............] - ETA: 49s - loss: 1.2357 - regression_loss: 1.0845 - classification_loss: 0.1512 311/500 [=================>............] - ETA: 48s - loss: 1.2355 - regression_loss: 1.0843 - classification_loss: 0.1512 312/500 [=================>............] - ETA: 48s - loss: 1.2358 - regression_loss: 1.0846 - classification_loss: 0.1512 313/500 [=================>............] - ETA: 48s - loss: 1.2330 - regression_loss: 1.0821 - classification_loss: 0.1509 314/500 [=================>............] - ETA: 48s - loss: 1.2350 - regression_loss: 1.0838 - classification_loss: 0.1512 315/500 [=================>............] - ETA: 47s - loss: 1.2338 - regression_loss: 1.0828 - classification_loss: 0.1510 316/500 [=================>............] - ETA: 47s - loss: 1.2335 - regression_loss: 1.0824 - classification_loss: 0.1510 317/500 [==================>...........] - ETA: 47s - loss: 1.2331 - regression_loss: 1.0822 - classification_loss: 0.1510 318/500 [==================>...........] 
- ETA: 47s - loss: 1.2342 - regression_loss: 1.0830 - classification_loss: 0.1512 319/500 [==================>...........] - ETA: 46s - loss: 1.2317 - regression_loss: 1.0808 - classification_loss: 0.1509 320/500 [==================>...........] - ETA: 46s - loss: 1.2324 - regression_loss: 1.0815 - classification_loss: 0.1509 321/500 [==================>...........] - ETA: 46s - loss: 1.2331 - regression_loss: 1.0820 - classification_loss: 0.1510 322/500 [==================>...........] - ETA: 46s - loss: 1.2354 - regression_loss: 1.0833 - classification_loss: 0.1521 323/500 [==================>...........] - ETA: 45s - loss: 1.2353 - regression_loss: 1.0833 - classification_loss: 0.1520 324/500 [==================>...........] - ETA: 45s - loss: 1.2349 - regression_loss: 1.0828 - classification_loss: 0.1521 325/500 [==================>...........] - ETA: 45s - loss: 1.2367 - regression_loss: 1.0844 - classification_loss: 0.1523 326/500 [==================>...........] - ETA: 45s - loss: 1.2375 - regression_loss: 1.0851 - classification_loss: 0.1524 327/500 [==================>...........] - ETA: 44s - loss: 1.2382 - regression_loss: 1.0860 - classification_loss: 0.1522 328/500 [==================>...........] - ETA: 44s - loss: 1.2368 - regression_loss: 1.0847 - classification_loss: 0.1521 329/500 [==================>...........] - ETA: 44s - loss: 1.2376 - regression_loss: 1.0855 - classification_loss: 0.1521 330/500 [==================>...........] - ETA: 44s - loss: 1.2383 - regression_loss: 1.0860 - classification_loss: 0.1523 331/500 [==================>...........] - ETA: 43s - loss: 1.2397 - regression_loss: 1.0872 - classification_loss: 0.1525 332/500 [==================>...........] - ETA: 43s - loss: 1.2392 - regression_loss: 1.0868 - classification_loss: 0.1524 333/500 [==================>...........] - ETA: 43s - loss: 1.2398 - regression_loss: 1.0873 - classification_loss: 0.1525 334/500 [===================>..........] 
500/500 [==============================] - 128s 257ms/step - loss: 1.2208 - regression_loss: 1.0711 - classification_loss: 0.1497
1172 instances of class plum with average precision: 0.7369
mAP: 0.7369
Epoch 00023: saving model to ./training/snapshots/resnet50_pascal_23.h5
Epoch 24/150
168/500 [=========>....................] - ETA: 1:25 - loss: 1.2274 - regression_loss: 1.0749 - classification_loss: 0.1524 169/500 [=========>....................]
- ETA: 1:25 - loss: 1.2281 - regression_loss: 1.0756 - classification_loss: 0.1525 170/500 [=========>....................] - ETA: 1:24 - loss: 1.2312 - regression_loss: 1.0781 - classification_loss: 0.1531 171/500 [=========>....................] - ETA: 1:24 - loss: 1.2311 - regression_loss: 1.0779 - classification_loss: 0.1532 172/500 [=========>....................] - ETA: 1:24 - loss: 1.2304 - regression_loss: 1.0772 - classification_loss: 0.1531 173/500 [=========>....................] - ETA: 1:24 - loss: 1.2319 - regression_loss: 1.0783 - classification_loss: 0.1536 174/500 [=========>....................] - ETA: 1:23 - loss: 1.2319 - regression_loss: 1.0782 - classification_loss: 0.1537 175/500 [=========>....................] - ETA: 1:23 - loss: 1.2305 - regression_loss: 1.0772 - classification_loss: 0.1534 176/500 [=========>....................] - ETA: 1:23 - loss: 1.2317 - regression_loss: 1.0778 - classification_loss: 0.1540 177/500 [=========>....................] - ETA: 1:23 - loss: 1.2341 - regression_loss: 1.0797 - classification_loss: 0.1543 178/500 [=========>....................] - ETA: 1:22 - loss: 1.2362 - regression_loss: 1.0816 - classification_loss: 0.1546 179/500 [=========>....................] - ETA: 1:22 - loss: 1.2395 - regression_loss: 1.0832 - classification_loss: 0.1563 180/500 [=========>....................] - ETA: 1:22 - loss: 1.2396 - regression_loss: 1.0835 - classification_loss: 0.1561 181/500 [=========>....................] - ETA: 1:22 - loss: 1.2381 - regression_loss: 1.0823 - classification_loss: 0.1558 182/500 [=========>....................] - ETA: 1:21 - loss: 1.2411 - regression_loss: 1.0848 - classification_loss: 0.1563 183/500 [=========>....................] - ETA: 1:21 - loss: 1.2391 - regression_loss: 1.0833 - classification_loss: 0.1558 184/500 [==========>...................] - ETA: 1:21 - loss: 1.2441 - regression_loss: 1.0874 - classification_loss: 0.1567 185/500 [==========>...................] 
- ETA: 1:20 - loss: 1.2415 - regression_loss: 1.0853 - classification_loss: 0.1562 186/500 [==========>...................] - ETA: 1:20 - loss: 1.2430 - regression_loss: 1.0868 - classification_loss: 0.1563 187/500 [==========>...................] - ETA: 1:20 - loss: 1.2427 - regression_loss: 1.0866 - classification_loss: 0.1560 188/500 [==========>...................] - ETA: 1:20 - loss: 1.2420 - regression_loss: 1.0861 - classification_loss: 0.1559 189/500 [==========>...................] - ETA: 1:19 - loss: 1.2430 - regression_loss: 1.0872 - classification_loss: 0.1558 190/500 [==========>...................] - ETA: 1:19 - loss: 1.2438 - regression_loss: 1.0880 - classification_loss: 0.1558 191/500 [==========>...................] - ETA: 1:19 - loss: 1.2409 - regression_loss: 1.0857 - classification_loss: 0.1552 192/500 [==========>...................] - ETA: 1:19 - loss: 1.2438 - regression_loss: 1.0879 - classification_loss: 0.1558 193/500 [==========>...................] - ETA: 1:18 - loss: 1.2445 - regression_loss: 1.0886 - classification_loss: 0.1559 194/500 [==========>...................] - ETA: 1:18 - loss: 1.2440 - regression_loss: 1.0884 - classification_loss: 0.1557 195/500 [==========>...................] - ETA: 1:18 - loss: 1.2437 - regression_loss: 1.0882 - classification_loss: 0.1556 196/500 [==========>...................] - ETA: 1:18 - loss: 1.2433 - regression_loss: 1.0879 - classification_loss: 0.1554 197/500 [==========>...................] - ETA: 1:17 - loss: 1.2439 - regression_loss: 1.0884 - classification_loss: 0.1555 198/500 [==========>...................] - ETA: 1:17 - loss: 1.2451 - regression_loss: 1.0897 - classification_loss: 0.1554 199/500 [==========>...................] - ETA: 1:17 - loss: 1.2443 - regression_loss: 1.0892 - classification_loss: 0.1551 200/500 [===========>..................] - ETA: 1:17 - loss: 1.2425 - regression_loss: 1.0878 - classification_loss: 0.1547 201/500 [===========>..................] 
- ETA: 1:16 - loss: 1.2426 - regression_loss: 1.0879 - classification_loss: 0.1546 202/500 [===========>..................] - ETA: 1:16 - loss: 1.2434 - regression_loss: 1.0886 - classification_loss: 0.1548 203/500 [===========>..................] - ETA: 1:16 - loss: 1.2454 - regression_loss: 1.0904 - classification_loss: 0.1550 204/500 [===========>..................] - ETA: 1:16 - loss: 1.2448 - regression_loss: 1.0899 - classification_loss: 0.1549 205/500 [===========>..................] - ETA: 1:15 - loss: 1.2442 - regression_loss: 1.0896 - classification_loss: 0.1546 206/500 [===========>..................] - ETA: 1:15 - loss: 1.2442 - regression_loss: 1.0896 - classification_loss: 0.1546 207/500 [===========>..................] - ETA: 1:15 - loss: 1.2457 - regression_loss: 1.0908 - classification_loss: 0.1549 208/500 [===========>..................] - ETA: 1:15 - loss: 1.2446 - regression_loss: 1.0899 - classification_loss: 0.1547 209/500 [===========>..................] - ETA: 1:14 - loss: 1.2441 - regression_loss: 1.0894 - classification_loss: 0.1547 210/500 [===========>..................] - ETA: 1:14 - loss: 1.2440 - regression_loss: 1.0894 - classification_loss: 0.1545 211/500 [===========>..................] - ETA: 1:14 - loss: 1.2452 - regression_loss: 1.0905 - classification_loss: 0.1547 212/500 [===========>..................] - ETA: 1:14 - loss: 1.2463 - regression_loss: 1.0915 - classification_loss: 0.1548 213/500 [===========>..................] - ETA: 1:13 - loss: 1.2480 - regression_loss: 1.0928 - classification_loss: 0.1552 214/500 [===========>..................] - ETA: 1:13 - loss: 1.2504 - regression_loss: 1.0947 - classification_loss: 0.1557 215/500 [===========>..................] - ETA: 1:13 - loss: 1.2488 - regression_loss: 1.0935 - classification_loss: 0.1553 216/500 [===========>..................] - ETA: 1:13 - loss: 1.2489 - regression_loss: 1.0935 - classification_loss: 0.1554 217/500 [============>.................] 
- ETA: 1:12 - loss: 1.2515 - regression_loss: 1.0956 - classification_loss: 0.1559 218/500 [============>.................] - ETA: 1:12 - loss: 1.2484 - regression_loss: 1.0926 - classification_loss: 0.1558 219/500 [============>.................] - ETA: 1:12 - loss: 1.2480 - regression_loss: 1.0923 - classification_loss: 0.1557 220/500 [============>.................] - ETA: 1:11 - loss: 1.2471 - regression_loss: 1.0916 - classification_loss: 0.1555 221/500 [============>.................] - ETA: 1:11 - loss: 1.2454 - regression_loss: 1.0902 - classification_loss: 0.1552 222/500 [============>.................] - ETA: 1:11 - loss: 1.2482 - regression_loss: 1.0925 - classification_loss: 0.1556 223/500 [============>.................] - ETA: 1:11 - loss: 1.2472 - regression_loss: 1.0916 - classification_loss: 0.1556 224/500 [============>.................] - ETA: 1:10 - loss: 1.2482 - regression_loss: 1.0924 - classification_loss: 0.1558 225/500 [============>.................] - ETA: 1:10 - loss: 1.2446 - regression_loss: 1.0892 - classification_loss: 0.1554 226/500 [============>.................] - ETA: 1:10 - loss: 1.2440 - regression_loss: 1.0887 - classification_loss: 0.1553 227/500 [============>.................] - ETA: 1:10 - loss: 1.2397 - regression_loss: 1.0848 - classification_loss: 0.1549 228/500 [============>.................] - ETA: 1:09 - loss: 1.2405 - regression_loss: 1.0857 - classification_loss: 0.1548 229/500 [============>.................] - ETA: 1:09 - loss: 1.2418 - regression_loss: 1.0868 - classification_loss: 0.1549 230/500 [============>.................] - ETA: 1:09 - loss: 1.2408 - regression_loss: 1.0862 - classification_loss: 0.1546 231/500 [============>.................] - ETA: 1:09 - loss: 1.2404 - regression_loss: 1.0856 - classification_loss: 0.1548 232/500 [============>.................] - ETA: 1:08 - loss: 1.2415 - regression_loss: 1.0865 - classification_loss: 0.1550 233/500 [============>.................] 
- ETA: 1:08 - loss: 1.2406 - regression_loss: 1.0856 - classification_loss: 0.1550 234/500 [=============>................] - ETA: 1:08 - loss: 1.2398 - regression_loss: 1.0850 - classification_loss: 0.1549 235/500 [=============>................] - ETA: 1:08 - loss: 1.2398 - regression_loss: 1.0847 - classification_loss: 0.1550 236/500 [=============>................] - ETA: 1:07 - loss: 1.2398 - regression_loss: 1.0848 - classification_loss: 0.1550 237/500 [=============>................] - ETA: 1:07 - loss: 1.2370 - regression_loss: 1.0824 - classification_loss: 0.1546 238/500 [=============>................] - ETA: 1:07 - loss: 1.2389 - regression_loss: 1.0845 - classification_loss: 0.1544 239/500 [=============>................] - ETA: 1:07 - loss: 1.2408 - regression_loss: 1.0861 - classification_loss: 0.1547 240/500 [=============>................] - ETA: 1:06 - loss: 1.2387 - regression_loss: 1.0843 - classification_loss: 0.1544 241/500 [=============>................] - ETA: 1:06 - loss: 1.2409 - regression_loss: 1.0863 - classification_loss: 0.1546 242/500 [=============>................] - ETA: 1:06 - loss: 1.2432 - regression_loss: 1.0883 - classification_loss: 0.1550 243/500 [=============>................] - ETA: 1:06 - loss: 1.2428 - regression_loss: 1.0879 - classification_loss: 0.1549 244/500 [=============>................] - ETA: 1:05 - loss: 1.2421 - regression_loss: 1.0870 - classification_loss: 0.1550 245/500 [=============>................] - ETA: 1:05 - loss: 1.2399 - regression_loss: 1.0853 - classification_loss: 0.1547 246/500 [=============>................] - ETA: 1:05 - loss: 1.2386 - regression_loss: 1.0843 - classification_loss: 0.1543 247/500 [=============>................] - ETA: 1:05 - loss: 1.2403 - regression_loss: 1.0857 - classification_loss: 0.1546 248/500 [=============>................] - ETA: 1:04 - loss: 1.2461 - regression_loss: 1.0900 - classification_loss: 0.1561 249/500 [=============>................] 
- ETA: 1:04 - loss: 1.2430 - regression_loss: 1.0874 - classification_loss: 0.1556 250/500 [==============>...............] - ETA: 1:04 - loss: 1.2424 - regression_loss: 1.0869 - classification_loss: 0.1555 251/500 [==============>...............] - ETA: 1:04 - loss: 1.2448 - regression_loss: 1.0888 - classification_loss: 0.1560 252/500 [==============>...............] - ETA: 1:03 - loss: 1.2422 - regression_loss: 1.0866 - classification_loss: 0.1556 253/500 [==============>...............] - ETA: 1:03 - loss: 1.2399 - regression_loss: 1.0846 - classification_loss: 0.1553 254/500 [==============>...............] - ETA: 1:03 - loss: 1.2383 - regression_loss: 1.0832 - classification_loss: 0.1551 255/500 [==============>...............] - ETA: 1:03 - loss: 1.2404 - regression_loss: 1.0850 - classification_loss: 0.1554 256/500 [==============>...............] - ETA: 1:02 - loss: 1.2406 - regression_loss: 1.0850 - classification_loss: 0.1556 257/500 [==============>...............] - ETA: 1:02 - loss: 1.2418 - regression_loss: 1.0861 - classification_loss: 0.1556 258/500 [==============>...............] - ETA: 1:02 - loss: 1.2395 - regression_loss: 1.0843 - classification_loss: 0.1552 259/500 [==============>...............] - ETA: 1:02 - loss: 1.2409 - regression_loss: 1.0846 - classification_loss: 0.1563 260/500 [==============>...............] - ETA: 1:01 - loss: 1.2425 - regression_loss: 1.0859 - classification_loss: 0.1565 261/500 [==============>...............] - ETA: 1:01 - loss: 1.2438 - regression_loss: 1.0871 - classification_loss: 0.1566 262/500 [==============>...............] - ETA: 1:01 - loss: 1.2458 - regression_loss: 1.0889 - classification_loss: 0.1569 263/500 [==============>...............] - ETA: 1:01 - loss: 1.2434 - regression_loss: 1.0867 - classification_loss: 0.1566 264/500 [==============>...............] - ETA: 1:00 - loss: 1.2437 - regression_loss: 1.0870 - classification_loss: 0.1566 265/500 [==============>...............] 
- ETA: 1:00 - loss: 1.2444 - regression_loss: 1.0877 - classification_loss: 0.1567 266/500 [==============>...............] - ETA: 1:00 - loss: 1.2473 - regression_loss: 1.0903 - classification_loss: 0.1570 267/500 [===============>..............] - ETA: 1:00 - loss: 1.2448 - regression_loss: 1.0882 - classification_loss: 0.1566 268/500 [===============>..............] - ETA: 59s - loss: 1.2466 - regression_loss: 1.0897 - classification_loss: 0.1569  269/500 [===============>..............] - ETA: 59s - loss: 1.2465 - regression_loss: 1.0896 - classification_loss: 0.1569 270/500 [===============>..............] - ETA: 59s - loss: 1.2468 - regression_loss: 1.0901 - classification_loss: 0.1567 271/500 [===============>..............] - ETA: 59s - loss: 1.2475 - regression_loss: 1.0907 - classification_loss: 0.1569 272/500 [===============>..............] - ETA: 58s - loss: 1.2515 - regression_loss: 1.0944 - classification_loss: 0.1572 273/500 [===============>..............] - ETA: 58s - loss: 1.2518 - regression_loss: 1.0947 - classification_loss: 0.1571 274/500 [===============>..............] - ETA: 58s - loss: 1.2506 - regression_loss: 1.0937 - classification_loss: 0.1568 275/500 [===============>..............] - ETA: 58s - loss: 1.2527 - regression_loss: 1.0956 - classification_loss: 0.1571 276/500 [===============>..............] - ETA: 57s - loss: 1.2529 - regression_loss: 1.0957 - classification_loss: 0.1571 277/500 [===============>..............] - ETA: 57s - loss: 1.2534 - regression_loss: 1.0962 - classification_loss: 0.1572 278/500 [===============>..............] - ETA: 57s - loss: 1.2536 - regression_loss: 1.0963 - classification_loss: 0.1573 279/500 [===============>..............] - ETA: 56s - loss: 1.2527 - regression_loss: 1.0955 - classification_loss: 0.1572 280/500 [===============>..............] - ETA: 56s - loss: 1.2514 - regression_loss: 1.0942 - classification_loss: 0.1572 281/500 [===============>..............] 
- ETA: 56s - loss: 1.2515 - regression_loss: 1.0943 - classification_loss: 0.1572 282/500 [===============>..............] - ETA: 56s - loss: 1.2516 - regression_loss: 1.0945 - classification_loss: 0.1571 283/500 [===============>..............] - ETA: 55s - loss: 1.2496 - regression_loss: 1.0929 - classification_loss: 0.1567 284/500 [================>.............] - ETA: 55s - loss: 1.2476 - regression_loss: 1.0911 - classification_loss: 0.1566 285/500 [================>.............] - ETA: 55s - loss: 1.2466 - regression_loss: 1.0903 - classification_loss: 0.1563 286/500 [================>.............] - ETA: 55s - loss: 1.2463 - regression_loss: 1.0901 - classification_loss: 0.1562 287/500 [================>.............] - ETA: 54s - loss: 1.2461 - regression_loss: 1.0901 - classification_loss: 0.1560 288/500 [================>.............] - ETA: 54s - loss: 1.2460 - regression_loss: 1.0899 - classification_loss: 0.1561 289/500 [================>.............] - ETA: 54s - loss: 1.2444 - regression_loss: 1.0886 - classification_loss: 0.1558 290/500 [================>.............] - ETA: 54s - loss: 1.2425 - regression_loss: 1.0872 - classification_loss: 0.1554 291/500 [================>.............] - ETA: 54s - loss: 1.2426 - regression_loss: 1.0873 - classification_loss: 0.1554 292/500 [================>.............] - ETA: 53s - loss: 1.2431 - regression_loss: 1.0876 - classification_loss: 0.1554 293/500 [================>.............] - ETA: 53s - loss: 1.2425 - regression_loss: 1.0873 - classification_loss: 0.1552 294/500 [================>.............] - ETA: 53s - loss: 1.2437 - regression_loss: 1.0883 - classification_loss: 0.1555 295/500 [================>.............] - ETA: 52s - loss: 1.2425 - regression_loss: 1.0873 - classification_loss: 0.1552 296/500 [================>.............] - ETA: 52s - loss: 1.2428 - regression_loss: 1.0875 - classification_loss: 0.1553 297/500 [================>.............] 
- ETA: 52s - loss: 1.2434 - regression_loss: 1.0880 - classification_loss: 0.1554 298/500 [================>.............] - ETA: 52s - loss: 1.2428 - regression_loss: 1.0877 - classification_loss: 0.1551 299/500 [================>.............] - ETA: 51s - loss: 1.2425 - regression_loss: 1.0874 - classification_loss: 0.1551 300/500 [=================>............] - ETA: 51s - loss: 1.2434 - regression_loss: 1.0882 - classification_loss: 0.1552 301/500 [=================>............] - ETA: 51s - loss: 1.2438 - regression_loss: 1.0885 - classification_loss: 0.1553 302/500 [=================>............] - ETA: 51s - loss: 1.2413 - regression_loss: 1.0864 - classification_loss: 0.1550 303/500 [=================>............] - ETA: 50s - loss: 1.2409 - regression_loss: 1.0860 - classification_loss: 0.1548 304/500 [=================>............] - ETA: 50s - loss: 1.2403 - regression_loss: 1.0855 - classification_loss: 0.1548 305/500 [=================>............] - ETA: 50s - loss: 1.2408 - regression_loss: 1.0858 - classification_loss: 0.1551 306/500 [=================>............] - ETA: 50s - loss: 1.2381 - regression_loss: 1.0834 - classification_loss: 0.1547 307/500 [=================>............] - ETA: 49s - loss: 1.2385 - regression_loss: 1.0836 - classification_loss: 0.1548 308/500 [=================>............] - ETA: 49s - loss: 1.2370 - regression_loss: 1.0824 - classification_loss: 0.1545 309/500 [=================>............] - ETA: 49s - loss: 1.2361 - regression_loss: 1.0816 - classification_loss: 0.1545 310/500 [=================>............] - ETA: 49s - loss: 1.2344 - regression_loss: 1.0802 - classification_loss: 0.1542 311/500 [=================>............] - ETA: 48s - loss: 1.2327 - regression_loss: 1.0788 - classification_loss: 0.1539 312/500 [=================>............] - ETA: 48s - loss: 1.2313 - regression_loss: 1.0776 - classification_loss: 0.1537 313/500 [=================>............] 
- ETA: 48s - loss: 1.2313 - regression_loss: 1.0777 - classification_loss: 0.1537 314/500 [=================>............] - ETA: 47s - loss: 1.2288 - regression_loss: 1.0755 - classification_loss: 0.1533 315/500 [=================>............] - ETA: 47s - loss: 1.2277 - regression_loss: 1.0747 - classification_loss: 0.1530 316/500 [=================>............] - ETA: 47s - loss: 1.2277 - regression_loss: 1.0748 - classification_loss: 0.1529 317/500 [==================>...........] - ETA: 47s - loss: 1.2270 - regression_loss: 1.0740 - classification_loss: 0.1530 318/500 [==================>...........] - ETA: 46s - loss: 1.2271 - regression_loss: 1.0742 - classification_loss: 0.1529 319/500 [==================>...........] - ETA: 46s - loss: 1.2276 - regression_loss: 1.0747 - classification_loss: 0.1529 320/500 [==================>...........] - ETA: 46s - loss: 1.2289 - regression_loss: 1.0758 - classification_loss: 0.1530 321/500 [==================>...........] - ETA: 46s - loss: 1.2305 - regression_loss: 1.0772 - classification_loss: 0.1533 322/500 [==================>...........] - ETA: 45s - loss: 1.2312 - regression_loss: 1.0777 - classification_loss: 0.1535 323/500 [==================>...........] - ETA: 45s - loss: 1.2324 - regression_loss: 1.0787 - classification_loss: 0.1537 324/500 [==================>...........] - ETA: 45s - loss: 1.2323 - regression_loss: 1.0787 - classification_loss: 0.1536 325/500 [==================>...........] - ETA: 45s - loss: 1.2338 - regression_loss: 1.0800 - classification_loss: 0.1538 326/500 [==================>...........] - ETA: 44s - loss: 1.2338 - regression_loss: 1.0801 - classification_loss: 0.1537 327/500 [==================>...........] - ETA: 44s - loss: 1.2361 - regression_loss: 1.0822 - classification_loss: 0.1538 328/500 [==================>...........] - ETA: 44s - loss: 1.2363 - regression_loss: 1.0824 - classification_loss: 0.1538 329/500 [==================>...........] 
- ETA: 44s - loss: 1.2386 - regression_loss: 1.0843 - classification_loss: 0.1543 330/500 [==================>...........] - ETA: 43s - loss: 1.2392 - regression_loss: 1.0849 - classification_loss: 0.1543 331/500 [==================>...........] - ETA: 43s - loss: 1.2380 - regression_loss: 1.0840 - classification_loss: 0.1540 332/500 [==================>...........] - ETA: 43s - loss: 1.2379 - regression_loss: 1.0839 - classification_loss: 0.1540 333/500 [==================>...........] - ETA: 42s - loss: 1.2380 - regression_loss: 1.0840 - classification_loss: 0.1540 334/500 [===================>..........] - ETA: 42s - loss: 1.2381 - regression_loss: 1.0840 - classification_loss: 0.1541 335/500 [===================>..........] - ETA: 42s - loss: 1.2380 - regression_loss: 1.0840 - classification_loss: 0.1540 336/500 [===================>..........] - ETA: 42s - loss: 1.2373 - regression_loss: 1.0831 - classification_loss: 0.1541 337/500 [===================>..........] - ETA: 41s - loss: 1.2358 - regression_loss: 1.0819 - classification_loss: 0.1539 338/500 [===================>..........] - ETA: 41s - loss: 1.2367 - regression_loss: 1.0828 - classification_loss: 0.1539 339/500 [===================>..........] - ETA: 41s - loss: 1.2370 - regression_loss: 1.0830 - classification_loss: 0.1540 340/500 [===================>..........] - ETA: 41s - loss: 1.2360 - regression_loss: 1.0822 - classification_loss: 0.1538 341/500 [===================>..........] - ETA: 40s - loss: 1.2349 - regression_loss: 1.0812 - classification_loss: 0.1536 342/500 [===================>..........] - ETA: 40s - loss: 1.2338 - regression_loss: 1.0804 - classification_loss: 0.1534 343/500 [===================>..........] - ETA: 40s - loss: 1.2334 - regression_loss: 1.0801 - classification_loss: 0.1532 344/500 [===================>..........] - ETA: 40s - loss: 1.2343 - regression_loss: 1.0808 - classification_loss: 0.1534 345/500 [===================>..........] 
- ETA: 39s - loss: 1.2343 - regression_loss: 1.0809 - classification_loss: 0.1534 346/500 [===================>..........] - ETA: 39s - loss: 1.2335 - regression_loss: 1.0800 - classification_loss: 0.1535 347/500 [===================>..........] - ETA: 39s - loss: 1.2337 - regression_loss: 1.0803 - classification_loss: 0.1534 348/500 [===================>..........] - ETA: 38s - loss: 1.2338 - regression_loss: 1.0804 - classification_loss: 0.1534 349/500 [===================>..........] - ETA: 38s - loss: 1.2339 - regression_loss: 1.0805 - classification_loss: 0.1534 350/500 [====================>.........] - ETA: 38s - loss: 1.2335 - regression_loss: 1.0801 - classification_loss: 0.1534 351/500 [====================>.........] - ETA: 38s - loss: 1.2347 - regression_loss: 1.0811 - classification_loss: 0.1536 352/500 [====================>.........] - ETA: 37s - loss: 1.2357 - regression_loss: 1.0819 - classification_loss: 0.1538 353/500 [====================>.........] - ETA: 37s - loss: 1.2361 - regression_loss: 1.0823 - classification_loss: 0.1538 354/500 [====================>.........] - ETA: 37s - loss: 1.2363 - regression_loss: 1.0826 - classification_loss: 0.1537 355/500 [====================>.........] - ETA: 37s - loss: 1.2353 - regression_loss: 1.0819 - classification_loss: 0.1534 356/500 [====================>.........] - ETA: 36s - loss: 1.2357 - regression_loss: 1.0823 - classification_loss: 0.1534 357/500 [====================>.........] - ETA: 36s - loss: 1.2353 - regression_loss: 1.0821 - classification_loss: 0.1532 358/500 [====================>.........] - ETA: 36s - loss: 1.2358 - regression_loss: 1.0825 - classification_loss: 0.1533 359/500 [====================>.........] - ETA: 36s - loss: 1.2364 - regression_loss: 1.0828 - classification_loss: 0.1535 360/500 [====================>.........] - ETA: 35s - loss: 1.2349 - regression_loss: 1.0817 - classification_loss: 0.1532 361/500 [====================>.........] 
- ETA: 35s - loss: 1.2339 - regression_loss: 1.0808 - classification_loss: 0.1531 362/500 [====================>.........] - ETA: 35s - loss: 1.2344 - regression_loss: 1.0813 - classification_loss: 0.1531 363/500 [====================>.........] - ETA: 34s - loss: 1.2346 - regression_loss: 1.0814 - classification_loss: 0.1532 364/500 [====================>.........] - ETA: 34s - loss: 1.2334 - regression_loss: 1.0804 - classification_loss: 0.1530 365/500 [====================>.........] - ETA: 34s - loss: 1.2337 - regression_loss: 1.0806 - classification_loss: 0.1531 366/500 [====================>.........] - ETA: 34s - loss: 1.2346 - regression_loss: 1.0815 - classification_loss: 0.1530 367/500 [=====================>........] - ETA: 33s - loss: 1.2362 - regression_loss: 1.0829 - classification_loss: 0.1533 368/500 [=====================>........] - ETA: 33s - loss: 1.2350 - regression_loss: 1.0820 - classification_loss: 0.1530 369/500 [=====================>........] - ETA: 33s - loss: 1.2342 - regression_loss: 1.0809 - classification_loss: 0.1533 370/500 [=====================>........] - ETA: 33s - loss: 1.2356 - regression_loss: 1.0820 - classification_loss: 0.1535 371/500 [=====================>........] - ETA: 32s - loss: 1.2365 - regression_loss: 1.0828 - classification_loss: 0.1537 372/500 [=====================>........] - ETA: 32s - loss: 1.2368 - regression_loss: 1.0831 - classification_loss: 0.1538 373/500 [=====================>........] - ETA: 32s - loss: 1.2377 - regression_loss: 1.0838 - classification_loss: 0.1540 374/500 [=====================>........] - ETA: 32s - loss: 1.2362 - regression_loss: 1.0825 - classification_loss: 0.1537 375/500 [=====================>........] - ETA: 31s - loss: 1.2354 - regression_loss: 1.0819 - classification_loss: 0.1535 376/500 [=====================>........] - ETA: 31s - loss: 1.2352 - regression_loss: 1.0818 - classification_loss: 0.1534 377/500 [=====================>........] 
[... per-batch progress output (steps 378-499 of epoch 24) truncated ...]
500/500 [==============================] - 127s 253ms/step - loss: 1.2256 - regression_loss: 1.0749 - classification_loss: 0.1507
1172 instances of class plum with average precision: 0.7707
mAP: 0.7707
Epoch 00024: saving model to ./training/snapshots/resnet50_pascal_24.h5
Epoch 25/150
[... per-batch progress output for epoch 25 (steps 1-212) truncated ...]
- ETA: 1:12 - loss: 1.2281 - regression_loss: 1.0780 - classification_loss: 0.1501 213/500 [===========>..................] - ETA: 1:12 - loss: 1.2269 - regression_loss: 1.0769 - classification_loss: 0.1501 214/500 [===========>..................] - ETA: 1:12 - loss: 1.2279 - regression_loss: 1.0776 - classification_loss: 0.1503 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2246 - regression_loss: 1.0747 - classification_loss: 0.1499 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2253 - regression_loss: 1.0753 - classification_loss: 0.1500 217/500 [============>.................] - ETA: 1:11 - loss: 1.2263 - regression_loss: 1.0762 - classification_loss: 0.1502 218/500 [============>.................] - ETA: 1:11 - loss: 1.2263 - regression_loss: 1.0762 - classification_loss: 0.1501 219/500 [============>.................] - ETA: 1:10 - loss: 1.2267 - regression_loss: 1.0765 - classification_loss: 0.1502 220/500 [============>.................] - ETA: 1:10 - loss: 1.2267 - regression_loss: 1.0764 - classification_loss: 0.1503 221/500 [============>.................] - ETA: 1:10 - loss: 1.2252 - regression_loss: 1.0752 - classification_loss: 0.1500 222/500 [============>.................] - ETA: 1:10 - loss: 1.2261 - regression_loss: 1.0760 - classification_loss: 0.1501 223/500 [============>.................] - ETA: 1:10 - loss: 1.2265 - regression_loss: 1.0764 - classification_loss: 0.1501 224/500 [============>.................] - ETA: 1:09 - loss: 1.2245 - regression_loss: 1.0749 - classification_loss: 0.1496 225/500 [============>.................] - ETA: 1:09 - loss: 1.2231 - regression_loss: 1.0738 - classification_loss: 0.1493 226/500 [============>.................] - ETA: 1:09 - loss: 1.2221 - regression_loss: 1.0725 - classification_loss: 0.1497 227/500 [============>.................] - ETA: 1:09 - loss: 1.2190 - regression_loss: 1.0695 - classification_loss: 0.1495 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.2195 - regression_loss: 1.0699 - classification_loss: 0.1495 229/500 [============>.................] - ETA: 1:08 - loss: 1.2179 - regression_loss: 1.0688 - classification_loss: 0.1491 230/500 [============>.................] - ETA: 1:08 - loss: 1.2182 - regression_loss: 1.0693 - classification_loss: 0.1489 231/500 [============>.................] - ETA: 1:08 - loss: 1.2165 - regression_loss: 1.0678 - classification_loss: 0.1486 232/500 [============>.................] - ETA: 1:07 - loss: 1.2185 - regression_loss: 1.0696 - classification_loss: 0.1489 233/500 [============>.................] - ETA: 1:07 - loss: 1.2185 - regression_loss: 1.0695 - classification_loss: 0.1490 234/500 [=============>................] - ETA: 1:07 - loss: 1.2181 - regression_loss: 1.0693 - classification_loss: 0.1488 235/500 [=============>................] - ETA: 1:07 - loss: 1.2210 - regression_loss: 1.0717 - classification_loss: 0.1493 236/500 [=============>................] - ETA: 1:06 - loss: 1.2206 - regression_loss: 1.0715 - classification_loss: 0.1491 237/500 [=============>................] - ETA: 1:06 - loss: 1.2210 - regression_loss: 1.0717 - classification_loss: 0.1493 238/500 [=============>................] - ETA: 1:06 - loss: 1.2186 - regression_loss: 1.0696 - classification_loss: 0.1490 239/500 [=============>................] - ETA: 1:06 - loss: 1.2158 - regression_loss: 1.0672 - classification_loss: 0.1486 240/500 [=============>................] - ETA: 1:05 - loss: 1.2182 - regression_loss: 1.0692 - classification_loss: 0.1490 241/500 [=============>................] - ETA: 1:05 - loss: 1.2182 - regression_loss: 1.0692 - classification_loss: 0.1490 242/500 [=============>................] - ETA: 1:05 - loss: 1.2194 - regression_loss: 1.0701 - classification_loss: 0.1493 243/500 [=============>................] - ETA: 1:05 - loss: 1.2186 - regression_loss: 1.0693 - classification_loss: 0.1493 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.2199 - regression_loss: 1.0704 - classification_loss: 0.1495 245/500 [=============>................] - ETA: 1:04 - loss: 1.2174 - regression_loss: 1.0683 - classification_loss: 0.1491 246/500 [=============>................] - ETA: 1:04 - loss: 1.2200 - regression_loss: 1.0705 - classification_loss: 0.1495 247/500 [=============>................] - ETA: 1:04 - loss: 1.2203 - regression_loss: 1.0709 - classification_loss: 0.1495 248/500 [=============>................] - ETA: 1:03 - loss: 1.2230 - regression_loss: 1.0732 - classification_loss: 0.1498 249/500 [=============>................] - ETA: 1:03 - loss: 1.2230 - regression_loss: 1.0732 - classification_loss: 0.1498 250/500 [==============>...............] - ETA: 1:03 - loss: 1.2225 - regression_loss: 1.0727 - classification_loss: 0.1498 251/500 [==============>...............] - ETA: 1:03 - loss: 1.2244 - regression_loss: 1.0742 - classification_loss: 0.1501 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2255 - regression_loss: 1.0752 - classification_loss: 0.1503 253/500 [==============>...............] - ETA: 1:02 - loss: 1.2276 - regression_loss: 1.0770 - classification_loss: 0.1507 254/500 [==============>...............] - ETA: 1:02 - loss: 1.2258 - regression_loss: 1.0755 - classification_loss: 0.1503 255/500 [==============>...............] - ETA: 1:02 - loss: 1.2243 - regression_loss: 1.0743 - classification_loss: 0.1500 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2253 - regression_loss: 1.0751 - classification_loss: 0.1502 257/500 [==============>...............] - ETA: 1:01 - loss: 1.2270 - regression_loss: 1.0765 - classification_loss: 0.1505 258/500 [==============>...............] - ETA: 1:01 - loss: 1.2255 - regression_loss: 1.0752 - classification_loss: 0.1504 259/500 [==============>...............] - ETA: 1:01 - loss: 1.2247 - regression_loss: 1.0745 - classification_loss: 0.1502 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.2225 - regression_loss: 1.0727 - classification_loss: 0.1498 261/500 [==============>...............] - ETA: 1:00 - loss: 1.2236 - regression_loss: 1.0737 - classification_loss: 0.1499 262/500 [==============>...............] - ETA: 1:00 - loss: 1.2234 - regression_loss: 1.0735 - classification_loss: 0.1499 263/500 [==============>...............] - ETA: 1:00 - loss: 1.2251 - regression_loss: 1.0750 - classification_loss: 0.1501 264/500 [==============>...............] - ETA: 59s - loss: 1.2260 - regression_loss: 1.0758 - classification_loss: 0.1502  265/500 [==============>...............] - ETA: 59s - loss: 1.2249 - regression_loss: 1.0747 - classification_loss: 0.1503 266/500 [==============>...............] - ETA: 59s - loss: 1.2245 - regression_loss: 1.0744 - classification_loss: 0.1501 267/500 [===============>..............] - ETA: 59s - loss: 1.2231 - regression_loss: 1.0733 - classification_loss: 0.1498 268/500 [===============>..............] - ETA: 58s - loss: 1.2208 - regression_loss: 1.0713 - classification_loss: 0.1494 269/500 [===============>..............] - ETA: 58s - loss: 1.2212 - regression_loss: 1.0718 - classification_loss: 0.1495 270/500 [===============>..............] - ETA: 58s - loss: 1.2224 - regression_loss: 1.0727 - classification_loss: 0.1497 271/500 [===============>..............] - ETA: 58s - loss: 1.2209 - regression_loss: 1.0715 - classification_loss: 0.1494 272/500 [===============>..............] - ETA: 57s - loss: 1.2202 - regression_loss: 1.0710 - classification_loss: 0.1492 273/500 [===============>..............] - ETA: 57s - loss: 1.2207 - regression_loss: 1.0715 - classification_loss: 0.1492 274/500 [===============>..............] - ETA: 57s - loss: 1.2178 - regression_loss: 1.0690 - classification_loss: 0.1488 275/500 [===============>..............] - ETA: 57s - loss: 1.2179 - regression_loss: 1.0691 - classification_loss: 0.1488 276/500 [===============>..............] 
- ETA: 56s - loss: 1.2180 - regression_loss: 1.0692 - classification_loss: 0.1488 277/500 [===============>..............] - ETA: 56s - loss: 1.2167 - regression_loss: 1.0681 - classification_loss: 0.1486 278/500 [===============>..............] - ETA: 56s - loss: 1.2176 - regression_loss: 1.0689 - classification_loss: 0.1487 279/500 [===============>..............] - ETA: 56s - loss: 1.2157 - regression_loss: 1.0673 - classification_loss: 0.1484 280/500 [===============>..............] - ETA: 55s - loss: 1.2148 - regression_loss: 1.0666 - classification_loss: 0.1482 281/500 [===============>..............] - ETA: 55s - loss: 1.2152 - regression_loss: 1.0670 - classification_loss: 0.1482 282/500 [===============>..............] - ETA: 55s - loss: 1.2159 - regression_loss: 1.0675 - classification_loss: 0.1484 283/500 [===============>..............] - ETA: 55s - loss: 1.2148 - regression_loss: 1.0666 - classification_loss: 0.1482 284/500 [================>.............] - ETA: 54s - loss: 1.2153 - regression_loss: 1.0669 - classification_loss: 0.1484 285/500 [================>.............] - ETA: 54s - loss: 1.2133 - regression_loss: 1.0650 - classification_loss: 0.1483 286/500 [================>.............] - ETA: 54s - loss: 1.2137 - regression_loss: 1.0653 - classification_loss: 0.1484 287/500 [================>.............] - ETA: 54s - loss: 1.2161 - regression_loss: 1.0675 - classification_loss: 0.1485 288/500 [================>.............] - ETA: 53s - loss: 1.2152 - regression_loss: 1.0669 - classification_loss: 0.1483 289/500 [================>.............] - ETA: 53s - loss: 1.2155 - regression_loss: 1.0672 - classification_loss: 0.1484 290/500 [================>.............] - ETA: 53s - loss: 1.2160 - regression_loss: 1.0675 - classification_loss: 0.1485 291/500 [================>.............] - ETA: 53s - loss: 1.2158 - regression_loss: 1.0673 - classification_loss: 0.1485 292/500 [================>.............] 
- ETA: 52s - loss: 1.2148 - regression_loss: 1.0664 - classification_loss: 0.1484 293/500 [================>.............] - ETA: 52s - loss: 1.2138 - regression_loss: 1.0656 - classification_loss: 0.1482 294/500 [================>.............] - ETA: 52s - loss: 1.2142 - regression_loss: 1.0661 - classification_loss: 0.1481 295/500 [================>.............] - ETA: 52s - loss: 1.2154 - regression_loss: 1.0670 - classification_loss: 0.1484 296/500 [================>.............] - ETA: 51s - loss: 1.2157 - regression_loss: 1.0673 - classification_loss: 0.1484 297/500 [================>.............] - ETA: 51s - loss: 1.2171 - regression_loss: 1.0686 - classification_loss: 0.1485 298/500 [================>.............] - ETA: 51s - loss: 1.2169 - regression_loss: 1.0684 - classification_loss: 0.1485 299/500 [================>.............] - ETA: 51s - loss: 1.2152 - regression_loss: 1.0671 - classification_loss: 0.1481 300/500 [=================>............] - ETA: 50s - loss: 1.2162 - regression_loss: 1.0681 - classification_loss: 0.1481 301/500 [=================>............] - ETA: 50s - loss: 1.2160 - regression_loss: 1.0679 - classification_loss: 0.1481 302/500 [=================>............] - ETA: 50s - loss: 1.2157 - regression_loss: 1.0677 - classification_loss: 0.1481 303/500 [=================>............] - ETA: 50s - loss: 1.2150 - regression_loss: 1.0670 - classification_loss: 0.1480 304/500 [=================>............] - ETA: 49s - loss: 1.2169 - regression_loss: 1.0687 - classification_loss: 0.1482 305/500 [=================>............] - ETA: 49s - loss: 1.2171 - regression_loss: 1.0690 - classification_loss: 0.1482 306/500 [=================>............] - ETA: 49s - loss: 1.2154 - regression_loss: 1.0675 - classification_loss: 0.1479 307/500 [=================>............] - ETA: 49s - loss: 1.2169 - regression_loss: 1.0689 - classification_loss: 0.1480 308/500 [=================>............] 
- ETA: 48s - loss: 1.2178 - regression_loss: 1.0698 - classification_loss: 0.1480 309/500 [=================>............] - ETA: 48s - loss: 1.2179 - regression_loss: 1.0701 - classification_loss: 0.1478 310/500 [=================>............] - ETA: 48s - loss: 1.2160 - regression_loss: 1.0685 - classification_loss: 0.1475 311/500 [=================>............] - ETA: 48s - loss: 1.2130 - regression_loss: 1.0658 - classification_loss: 0.1471 312/500 [=================>............] - ETA: 47s - loss: 1.2130 - regression_loss: 1.0657 - classification_loss: 0.1473 313/500 [=================>............] - ETA: 47s - loss: 1.2135 - regression_loss: 1.0663 - classification_loss: 0.1472 314/500 [=================>............] - ETA: 47s - loss: 1.2126 - regression_loss: 1.0656 - classification_loss: 0.1470 315/500 [=================>............] - ETA: 47s - loss: 1.2132 - regression_loss: 1.0660 - classification_loss: 0.1472 316/500 [=================>............] - ETA: 46s - loss: 1.2122 - regression_loss: 1.0650 - classification_loss: 0.1472 317/500 [==================>...........] - ETA: 46s - loss: 1.2133 - regression_loss: 1.0658 - classification_loss: 0.1475 318/500 [==================>...........] - ETA: 46s - loss: 1.2128 - regression_loss: 1.0654 - classification_loss: 0.1475 319/500 [==================>...........] - ETA: 46s - loss: 1.2130 - regression_loss: 1.0656 - classification_loss: 0.1474 320/500 [==================>...........] - ETA: 45s - loss: 1.2114 - regression_loss: 1.0642 - classification_loss: 0.1471 321/500 [==================>...........] - ETA: 45s - loss: 1.2122 - regression_loss: 1.0649 - classification_loss: 0.1473 322/500 [==================>...........] - ETA: 45s - loss: 1.2127 - regression_loss: 1.0654 - classification_loss: 0.1474 323/500 [==================>...........] - ETA: 45s - loss: 1.2136 - regression_loss: 1.0661 - classification_loss: 0.1475 324/500 [==================>...........] 
- ETA: 44s - loss: 1.2135 - regression_loss: 1.0661 - classification_loss: 0.1473 325/500 [==================>...........] - ETA: 44s - loss: 1.2124 - regression_loss: 1.0653 - classification_loss: 0.1471 326/500 [==================>...........] - ETA: 44s - loss: 1.2126 - regression_loss: 1.0655 - classification_loss: 0.1471 327/500 [==================>...........] - ETA: 44s - loss: 1.2137 - regression_loss: 1.0664 - classification_loss: 0.1473 328/500 [==================>...........] - ETA: 43s - loss: 1.2136 - regression_loss: 1.0664 - classification_loss: 0.1472 329/500 [==================>...........] - ETA: 43s - loss: 1.2128 - regression_loss: 1.0658 - classification_loss: 0.1470 330/500 [==================>...........] - ETA: 43s - loss: 1.2131 - regression_loss: 1.0660 - classification_loss: 0.1471 331/500 [==================>...........] - ETA: 43s - loss: 1.2103 - regression_loss: 1.0636 - classification_loss: 0.1467 332/500 [==================>...........] - ETA: 42s - loss: 1.2099 - regression_loss: 1.0633 - classification_loss: 0.1466 333/500 [==================>...........] - ETA: 42s - loss: 1.2088 - regression_loss: 1.0622 - classification_loss: 0.1466 334/500 [===================>..........] - ETA: 42s - loss: 1.2089 - regression_loss: 1.0623 - classification_loss: 0.1467 335/500 [===================>..........] - ETA: 42s - loss: 1.2073 - regression_loss: 1.0608 - classification_loss: 0.1465 336/500 [===================>..........] - ETA: 41s - loss: 1.2075 - regression_loss: 1.0610 - classification_loss: 0.1465 337/500 [===================>..........] - ETA: 41s - loss: 1.2081 - regression_loss: 1.0615 - classification_loss: 0.1467 338/500 [===================>..........] - ETA: 41s - loss: 1.2086 - regression_loss: 1.0619 - classification_loss: 0.1467 339/500 [===================>..........] - ETA: 41s - loss: 1.2102 - regression_loss: 1.0629 - classification_loss: 0.1473 340/500 [===================>..........] 
- ETA: 40s - loss: 1.2107 - regression_loss: 1.0633 - classification_loss: 0.1474 341/500 [===================>..........] - ETA: 40s - loss: 1.2095 - regression_loss: 1.0623 - classification_loss: 0.1472 342/500 [===================>..........] - ETA: 40s - loss: 1.2095 - regression_loss: 1.0623 - classification_loss: 0.1472 343/500 [===================>..........] - ETA: 40s - loss: 1.2103 - regression_loss: 1.0634 - classification_loss: 0.1469 344/500 [===================>..........] - ETA: 39s - loss: 1.2093 - regression_loss: 1.0625 - classification_loss: 0.1468 345/500 [===================>..........] - ETA: 39s - loss: 1.2088 - regression_loss: 1.0620 - classification_loss: 0.1468 346/500 [===================>..........] - ETA: 39s - loss: 1.2090 - regression_loss: 1.0621 - classification_loss: 0.1468 347/500 [===================>..........] - ETA: 38s - loss: 1.2091 - regression_loss: 1.0624 - classification_loss: 0.1467 348/500 [===================>..........] - ETA: 38s - loss: 1.2072 - regression_loss: 1.0607 - classification_loss: 0.1465 349/500 [===================>..........] - ETA: 38s - loss: 1.2077 - regression_loss: 1.0611 - classification_loss: 0.1466 350/500 [====================>.........] - ETA: 38s - loss: 1.2067 - regression_loss: 1.0604 - classification_loss: 0.1463 351/500 [====================>.........] - ETA: 37s - loss: 1.2049 - regression_loss: 1.0588 - classification_loss: 0.1460 352/500 [====================>.........] - ETA: 37s - loss: 1.2053 - regression_loss: 1.0592 - classification_loss: 0.1461 353/500 [====================>.........] - ETA: 37s - loss: 1.2086 - regression_loss: 1.0618 - classification_loss: 0.1467 354/500 [====================>.........] - ETA: 37s - loss: 1.2096 - regression_loss: 1.0628 - classification_loss: 0.1468 355/500 [====================>.........] - ETA: 36s - loss: 1.2096 - regression_loss: 1.0628 - classification_loss: 0.1468 356/500 [====================>.........] 
- ETA: 36s - loss: 1.2083 - regression_loss: 1.0616 - classification_loss: 0.1467 357/500 [====================>.........] - ETA: 36s - loss: 1.2063 - regression_loss: 1.0600 - classification_loss: 0.1463 358/500 [====================>.........] - ETA: 36s - loss: 1.2064 - regression_loss: 1.0601 - classification_loss: 0.1463 359/500 [====================>.........] - ETA: 35s - loss: 1.2076 - regression_loss: 1.0612 - classification_loss: 0.1464 360/500 [====================>.........] - ETA: 35s - loss: 1.2082 - regression_loss: 1.0616 - classification_loss: 0.1466 361/500 [====================>.........] - ETA: 35s - loss: 1.2067 - regression_loss: 1.0603 - classification_loss: 0.1464 362/500 [====================>.........] - ETA: 35s - loss: 1.2080 - regression_loss: 1.0614 - classification_loss: 0.1466 363/500 [====================>.........] - ETA: 34s - loss: 1.2065 - regression_loss: 1.0603 - classification_loss: 0.1463 364/500 [====================>.........] - ETA: 34s - loss: 1.2046 - regression_loss: 1.0585 - classification_loss: 0.1461 365/500 [====================>.........] - ETA: 34s - loss: 1.2045 - regression_loss: 1.0585 - classification_loss: 0.1460 366/500 [====================>.........] - ETA: 34s - loss: 1.2041 - regression_loss: 1.0581 - classification_loss: 0.1460 367/500 [=====================>........] - ETA: 33s - loss: 1.2052 - regression_loss: 1.0590 - classification_loss: 0.1462 368/500 [=====================>........] - ETA: 33s - loss: 1.2041 - regression_loss: 1.0580 - classification_loss: 0.1461 369/500 [=====================>........] - ETA: 33s - loss: 1.2041 - regression_loss: 1.0581 - classification_loss: 0.1460 370/500 [=====================>........] - ETA: 33s - loss: 1.2055 - regression_loss: 1.0592 - classification_loss: 0.1462 371/500 [=====================>........] - ETA: 32s - loss: 1.2056 - regression_loss: 1.0594 - classification_loss: 0.1462 372/500 [=====================>........] 
- ETA: 32s - loss: 1.2036 - regression_loss: 1.0577 - classification_loss: 0.1459 373/500 [=====================>........] - ETA: 32s - loss: 1.2034 - regression_loss: 1.0576 - classification_loss: 0.1458 374/500 [=====================>........] - ETA: 32s - loss: 1.2032 - regression_loss: 1.0575 - classification_loss: 0.1457 375/500 [=====================>........] - ETA: 31s - loss: 1.2026 - regression_loss: 1.0569 - classification_loss: 0.1456 376/500 [=====================>........] - ETA: 31s - loss: 1.2020 - regression_loss: 1.0565 - classification_loss: 0.1455 377/500 [=====================>........] - ETA: 31s - loss: 1.2032 - regression_loss: 1.0575 - classification_loss: 0.1457 378/500 [=====================>........] - ETA: 30s - loss: 1.2029 - regression_loss: 1.0573 - classification_loss: 0.1456 379/500 [=====================>........] - ETA: 30s - loss: 1.2034 - regression_loss: 1.0577 - classification_loss: 0.1457 380/500 [=====================>........] - ETA: 30s - loss: 1.2060 - regression_loss: 1.0597 - classification_loss: 0.1462 381/500 [=====================>........] - ETA: 30s - loss: 1.2055 - regression_loss: 1.0593 - classification_loss: 0.1462 382/500 [=====================>........] - ETA: 29s - loss: 1.2057 - regression_loss: 1.0595 - classification_loss: 0.1462 383/500 [=====================>........] - ETA: 29s - loss: 1.2054 - regression_loss: 1.0593 - classification_loss: 0.1462 384/500 [======================>.......] - ETA: 29s - loss: 1.2037 - regression_loss: 1.0577 - classification_loss: 0.1460 385/500 [======================>.......] - ETA: 29s - loss: 1.2029 - regression_loss: 1.0570 - classification_loss: 0.1459 386/500 [======================>.......] - ETA: 28s - loss: 1.2043 - regression_loss: 1.0583 - classification_loss: 0.1460 387/500 [======================>.......] - ETA: 28s - loss: 1.2046 - regression_loss: 1.0585 - classification_loss: 0.1462 388/500 [======================>.......] 
- ETA: 28s - loss: 1.2044 - regression_loss: 1.0583 - classification_loss: 0.1461 389/500 [======================>.......] - ETA: 28s - loss: 1.2057 - regression_loss: 1.0592 - classification_loss: 0.1464 390/500 [======================>.......] - ETA: 27s - loss: 1.2062 - regression_loss: 1.0597 - classification_loss: 0.1465 391/500 [======================>.......] - ETA: 27s - loss: 1.2047 - regression_loss: 1.0584 - classification_loss: 0.1462 392/500 [======================>.......] - ETA: 27s - loss: 1.2036 - regression_loss: 1.0576 - classification_loss: 0.1461 393/500 [======================>.......] - ETA: 27s - loss: 1.2044 - regression_loss: 1.0582 - classification_loss: 0.1462 394/500 [======================>.......] - ETA: 26s - loss: 1.2040 - regression_loss: 1.0579 - classification_loss: 0.1461 395/500 [======================>.......] - ETA: 26s - loss: 1.2038 - regression_loss: 1.0576 - classification_loss: 0.1462 396/500 [======================>.......] - ETA: 26s - loss: 1.2025 - regression_loss: 1.0565 - classification_loss: 0.1460 397/500 [======================>.......] - ETA: 26s - loss: 1.2033 - regression_loss: 1.0572 - classification_loss: 0.1461 398/500 [======================>.......] - ETA: 25s - loss: 1.2033 - regression_loss: 1.0573 - classification_loss: 0.1460 399/500 [======================>.......] - ETA: 25s - loss: 1.2026 - regression_loss: 1.0567 - classification_loss: 0.1460 400/500 [=======================>......] - ETA: 25s - loss: 1.2032 - regression_loss: 1.0571 - classification_loss: 0.1460 401/500 [=======================>......] - ETA: 25s - loss: 1.2034 - regression_loss: 1.0575 - classification_loss: 0.1459 402/500 [=======================>......] - ETA: 24s - loss: 1.2032 - regression_loss: 1.0573 - classification_loss: 0.1459 403/500 [=======================>......] - ETA: 24s - loss: 1.2026 - regression_loss: 1.0569 - classification_loss: 0.1457 404/500 [=======================>......] 
- ETA: 24s - loss: 1.2039 - regression_loss: 1.0581 - classification_loss: 0.1458 405/500 [=======================>......] - ETA: 24s - loss: 1.2045 - regression_loss: 1.0585 - classification_loss: 0.1459 406/500 [=======================>......] - ETA: 23s - loss: 1.2053 - regression_loss: 1.0593 - classification_loss: 0.1460 407/500 [=======================>......] - ETA: 23s - loss: 1.2044 - regression_loss: 1.0586 - classification_loss: 0.1458 408/500 [=======================>......] - ETA: 23s - loss: 1.2042 - regression_loss: 1.0586 - classification_loss: 0.1456 409/500 [=======================>......] - ETA: 22s - loss: 1.2029 - regression_loss: 1.0575 - classification_loss: 0.1454 410/500 [=======================>......] - ETA: 22s - loss: 1.2035 - regression_loss: 1.0580 - classification_loss: 0.1455 411/500 [=======================>......] - ETA: 22s - loss: 1.2038 - regression_loss: 1.0583 - classification_loss: 0.1455 412/500 [=======================>......] - ETA: 22s - loss: 1.2034 - regression_loss: 1.0580 - classification_loss: 0.1454 413/500 [=======================>......] - ETA: 21s - loss: 1.2035 - regression_loss: 1.0581 - classification_loss: 0.1453 414/500 [=======================>......] - ETA: 21s - loss: 1.2022 - regression_loss: 1.0571 - classification_loss: 0.1451 415/500 [=======================>......] - ETA: 21s - loss: 1.2030 - regression_loss: 1.0575 - classification_loss: 0.1454 416/500 [=======================>......] - ETA: 21s - loss: 1.2024 - regression_loss: 1.0572 - classification_loss: 0.1452 417/500 [========================>.....] - ETA: 20s - loss: 1.2026 - regression_loss: 1.0574 - classification_loss: 0.1453 418/500 [========================>.....] - ETA: 20s - loss: 1.2031 - regression_loss: 1.0576 - classification_loss: 0.1455 419/500 [========================>.....] - ETA: 20s - loss: 1.2018 - regression_loss: 1.0566 - classification_loss: 0.1453 420/500 [========================>.....] 
[per-batch progress output for epoch 25, batches 421–499, omitted]
500/500 [==============================] - 127s 253ms/step - loss: 1.2143 - regression_loss: 1.0666 - classification_loss: 0.1477
1172 instances of class plum with average precision: 0.7590
mAP: 0.7590
Epoch 00025: saving model to ./training/snapshots/resnet50_pascal_25.h5
Epoch 26/150
[per-batch progress output for epoch 26, batches 1–13, omitted]
[per-batch progress output for epoch 26, batches 14–254, omitted]
- ETA: 1:06 - loss: 1.1989 - regression_loss: 1.0527 - classification_loss: 0.1463 239/500 [=============>................] - ETA: 1:05 - loss: 1.1975 - regression_loss: 1.0515 - classification_loss: 0.1460 240/500 [=============>................] - ETA: 1:05 - loss: 1.1987 - regression_loss: 1.0524 - classification_loss: 0.1463 241/500 [=============>................] - ETA: 1:05 - loss: 1.1990 - regression_loss: 1.0526 - classification_loss: 0.1464 242/500 [=============>................] - ETA: 1:05 - loss: 1.1987 - regression_loss: 1.0524 - classification_loss: 0.1463 243/500 [=============>................] - ETA: 1:04 - loss: 1.2008 - regression_loss: 1.0544 - classification_loss: 0.1464 244/500 [=============>................] - ETA: 1:04 - loss: 1.2021 - regression_loss: 1.0554 - classification_loss: 0.1467 245/500 [=============>................] - ETA: 1:04 - loss: 1.2054 - regression_loss: 1.0582 - classification_loss: 0.1472 246/500 [=============>................] - ETA: 1:04 - loss: 1.2065 - regression_loss: 1.0593 - classification_loss: 0.1472 247/500 [=============>................] - ETA: 1:03 - loss: 1.2071 - regression_loss: 1.0598 - classification_loss: 0.1473 248/500 [=============>................] - ETA: 1:03 - loss: 1.2057 - regression_loss: 1.0583 - classification_loss: 0.1474 249/500 [=============>................] - ETA: 1:03 - loss: 1.2059 - regression_loss: 1.0584 - classification_loss: 0.1475 250/500 [==============>...............] - ETA: 1:03 - loss: 1.2103 - regression_loss: 1.0622 - classification_loss: 0.1481 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2105 - regression_loss: 1.0623 - classification_loss: 0.1482 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2100 - regression_loss: 1.0620 - classification_loss: 0.1481 253/500 [==============>...............] - ETA: 1:02 - loss: 1.2084 - regression_loss: 1.0607 - classification_loss: 0.1477 254/500 [==============>...............] 
- ETA: 1:02 - loss: 1.2105 - regression_loss: 1.0625 - classification_loss: 0.1480 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2082 - regression_loss: 1.0606 - classification_loss: 0.1477 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2073 - regression_loss: 1.0598 - classification_loss: 0.1475 257/500 [==============>...............] - ETA: 1:01 - loss: 1.2059 - regression_loss: 1.0588 - classification_loss: 0.1471 258/500 [==============>...............] - ETA: 1:01 - loss: 1.2077 - regression_loss: 1.0602 - classification_loss: 0.1475 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2088 - regression_loss: 1.0610 - classification_loss: 0.1478 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2081 - regression_loss: 1.0605 - classification_loss: 0.1476 261/500 [==============>...............] - ETA: 1:00 - loss: 1.2067 - regression_loss: 1.0592 - classification_loss: 0.1475 262/500 [==============>...............] - ETA: 1:00 - loss: 1.2064 - regression_loss: 1.0589 - classification_loss: 0.1475 263/500 [==============>...............] - ETA: 59s - loss: 1.2064 - regression_loss: 1.0589 - classification_loss: 0.1475  264/500 [==============>...............] - ETA: 59s - loss: 1.2072 - regression_loss: 1.0597 - classification_loss: 0.1475 265/500 [==============>...............] - ETA: 59s - loss: 1.2070 - regression_loss: 1.0595 - classification_loss: 0.1475 266/500 [==============>...............] - ETA: 59s - loss: 1.2049 - regression_loss: 1.0578 - classification_loss: 0.1471 267/500 [===============>..............] - ETA: 58s - loss: 1.2041 - regression_loss: 1.0571 - classification_loss: 0.1470 268/500 [===============>..............] - ETA: 58s - loss: 1.2040 - regression_loss: 1.0570 - classification_loss: 0.1470 269/500 [===============>..............] - ETA: 58s - loss: 1.2016 - regression_loss: 1.0550 - classification_loss: 0.1466 270/500 [===============>..............] 
- ETA: 58s - loss: 1.2000 - regression_loss: 1.0535 - classification_loss: 0.1465 271/500 [===============>..............] - ETA: 57s - loss: 1.2006 - regression_loss: 1.0542 - classification_loss: 0.1464 272/500 [===============>..............] - ETA: 57s - loss: 1.2006 - regression_loss: 1.0545 - classification_loss: 0.1461 273/500 [===============>..............] - ETA: 57s - loss: 1.1972 - regression_loss: 1.0516 - classification_loss: 0.1457 274/500 [===============>..............] - ETA: 57s - loss: 1.1993 - regression_loss: 1.0533 - classification_loss: 0.1460 275/500 [===============>..............] - ETA: 56s - loss: 1.1976 - regression_loss: 1.0520 - classification_loss: 0.1457 276/500 [===============>..............] - ETA: 56s - loss: 1.1960 - regression_loss: 1.0506 - classification_loss: 0.1454 277/500 [===============>..............] - ETA: 56s - loss: 1.1959 - regression_loss: 1.0506 - classification_loss: 0.1453 278/500 [===============>..............] - ETA: 56s - loss: 1.1950 - regression_loss: 1.0499 - classification_loss: 0.1451 279/500 [===============>..............] - ETA: 55s - loss: 1.1935 - regression_loss: 1.0487 - classification_loss: 0.1448 280/500 [===============>..............] - ETA: 55s - loss: 1.1916 - regression_loss: 1.0471 - classification_loss: 0.1445 281/500 [===============>..............] - ETA: 55s - loss: 1.1929 - regression_loss: 1.0481 - classification_loss: 0.1448 282/500 [===============>..............] - ETA: 55s - loss: 1.1928 - regression_loss: 1.0480 - classification_loss: 0.1448 283/500 [===============>..............] - ETA: 54s - loss: 1.1915 - regression_loss: 1.0469 - classification_loss: 0.1447 284/500 [================>.............] - ETA: 54s - loss: 1.1919 - regression_loss: 1.0472 - classification_loss: 0.1447 285/500 [================>.............] - ETA: 54s - loss: 1.1911 - regression_loss: 1.0466 - classification_loss: 0.1445 286/500 [================>.............] 
- ETA: 54s - loss: 1.1923 - regression_loss: 1.0476 - classification_loss: 0.1447 287/500 [================>.............] - ETA: 53s - loss: 1.1928 - regression_loss: 1.0481 - classification_loss: 0.1447 288/500 [================>.............] - ETA: 53s - loss: 1.1934 - regression_loss: 1.0486 - classification_loss: 0.1448 289/500 [================>.............] - ETA: 53s - loss: 1.1945 - regression_loss: 1.0493 - classification_loss: 0.1452 290/500 [================>.............] - ETA: 53s - loss: 1.1937 - regression_loss: 1.0486 - classification_loss: 0.1451 291/500 [================>.............] - ETA: 52s - loss: 1.1946 - regression_loss: 1.0491 - classification_loss: 0.1456 292/500 [================>.............] - ETA: 52s - loss: 1.1938 - regression_loss: 1.0484 - classification_loss: 0.1455 293/500 [================>.............] - ETA: 52s - loss: 1.1938 - regression_loss: 1.0483 - classification_loss: 0.1455 294/500 [================>.............] - ETA: 52s - loss: 1.1948 - regression_loss: 1.0490 - classification_loss: 0.1457 295/500 [================>.............] - ETA: 51s - loss: 1.1970 - regression_loss: 1.0508 - classification_loss: 0.1462 296/500 [================>.............] - ETA: 51s - loss: 1.1975 - regression_loss: 1.0510 - classification_loss: 0.1464 297/500 [================>.............] - ETA: 51s - loss: 1.1978 - regression_loss: 1.0513 - classification_loss: 0.1465 298/500 [================>.............] - ETA: 51s - loss: 1.1985 - regression_loss: 1.0518 - classification_loss: 0.1467 299/500 [================>.............] - ETA: 50s - loss: 1.1992 - regression_loss: 1.0526 - classification_loss: 0.1467 300/500 [=================>............] - ETA: 50s - loss: 1.1986 - regression_loss: 1.0521 - classification_loss: 0.1464 301/500 [=================>............] - ETA: 50s - loss: 1.1985 - regression_loss: 1.0522 - classification_loss: 0.1463 302/500 [=================>............] 
- ETA: 50s - loss: 1.1988 - regression_loss: 1.0526 - classification_loss: 0.1462 303/500 [=================>............] - ETA: 49s - loss: 1.1992 - regression_loss: 1.0526 - classification_loss: 0.1466 304/500 [=================>............] - ETA: 49s - loss: 1.1990 - regression_loss: 1.0526 - classification_loss: 0.1465 305/500 [=================>............] - ETA: 49s - loss: 1.1966 - regression_loss: 1.0505 - classification_loss: 0.1461 306/500 [=================>............] - ETA: 49s - loss: 1.1968 - regression_loss: 1.0508 - classification_loss: 0.1460 307/500 [=================>............] - ETA: 48s - loss: 1.1957 - regression_loss: 1.0499 - classification_loss: 0.1458 308/500 [=================>............] - ETA: 48s - loss: 1.1956 - regression_loss: 1.0499 - classification_loss: 0.1457 309/500 [=================>............] - ETA: 48s - loss: 1.1951 - regression_loss: 1.0496 - classification_loss: 0.1455 310/500 [=================>............] - ETA: 48s - loss: 1.1945 - regression_loss: 1.0491 - classification_loss: 0.1454 311/500 [=================>............] - ETA: 47s - loss: 1.1958 - regression_loss: 1.0504 - classification_loss: 0.1454 312/500 [=================>............] - ETA: 47s - loss: 1.1946 - regression_loss: 1.0493 - classification_loss: 0.1453 313/500 [=================>............] - ETA: 47s - loss: 1.1950 - regression_loss: 1.0497 - classification_loss: 0.1453 314/500 [=================>............] - ETA: 47s - loss: 1.1960 - regression_loss: 1.0505 - classification_loss: 0.1455 315/500 [=================>............] - ETA: 46s - loss: 1.1937 - regression_loss: 1.0484 - classification_loss: 0.1453 316/500 [=================>............] - ETA: 46s - loss: 1.1953 - regression_loss: 1.0498 - classification_loss: 0.1455 317/500 [==================>...........] - ETA: 46s - loss: 1.1954 - regression_loss: 1.0500 - classification_loss: 0.1455 318/500 [==================>...........] 
- ETA: 46s - loss: 1.1956 - regression_loss: 1.0503 - classification_loss: 0.1453 319/500 [==================>...........] - ETA: 45s - loss: 1.1946 - regression_loss: 1.0496 - classification_loss: 0.1450 320/500 [==================>...........] - ETA: 45s - loss: 1.1927 - regression_loss: 1.0479 - classification_loss: 0.1448 321/500 [==================>...........] - ETA: 45s - loss: 1.1933 - regression_loss: 1.0484 - classification_loss: 0.1449 322/500 [==================>...........] - ETA: 45s - loss: 1.1908 - regression_loss: 1.0462 - classification_loss: 0.1446 323/500 [==================>...........] - ETA: 44s - loss: 1.1908 - regression_loss: 1.0461 - classification_loss: 0.1446 324/500 [==================>...........] - ETA: 44s - loss: 1.1888 - regression_loss: 1.0445 - classification_loss: 0.1444 325/500 [==================>...........] - ETA: 44s - loss: 1.1897 - regression_loss: 1.0451 - classification_loss: 0.1446 326/500 [==================>...........] - ETA: 44s - loss: 1.1894 - regression_loss: 1.0449 - classification_loss: 0.1444 327/500 [==================>...........] - ETA: 43s - loss: 1.1913 - regression_loss: 1.0464 - classification_loss: 0.1449 328/500 [==================>...........] - ETA: 43s - loss: 1.1914 - regression_loss: 1.0465 - classification_loss: 0.1450 329/500 [==================>...........] - ETA: 43s - loss: 1.1923 - regression_loss: 1.0472 - classification_loss: 0.1451 330/500 [==================>...........] - ETA: 43s - loss: 1.1916 - regression_loss: 1.0468 - classification_loss: 0.1449 331/500 [==================>...........] - ETA: 42s - loss: 1.1901 - regression_loss: 1.0455 - classification_loss: 0.1446 332/500 [==================>...........] - ETA: 42s - loss: 1.1875 - regression_loss: 1.0431 - classification_loss: 0.1444 333/500 [==================>...........] - ETA: 42s - loss: 1.1883 - regression_loss: 1.0439 - classification_loss: 0.1444 334/500 [===================>..........] 
- ETA: 42s - loss: 1.1861 - regression_loss: 1.0420 - classification_loss: 0.1441 335/500 [===================>..........] - ETA: 41s - loss: 1.1852 - regression_loss: 1.0413 - classification_loss: 0.1439 336/500 [===================>..........] - ETA: 41s - loss: 1.1855 - regression_loss: 1.0417 - classification_loss: 0.1438 337/500 [===================>..........] - ETA: 41s - loss: 1.1866 - regression_loss: 1.0426 - classification_loss: 0.1441 338/500 [===================>..........] - ETA: 41s - loss: 1.1862 - regression_loss: 1.0424 - classification_loss: 0.1438 339/500 [===================>..........] - ETA: 40s - loss: 1.1860 - regression_loss: 1.0422 - classification_loss: 0.1438 340/500 [===================>..........] - ETA: 40s - loss: 1.1878 - regression_loss: 1.0437 - classification_loss: 0.1441 341/500 [===================>..........] - ETA: 40s - loss: 1.1853 - regression_loss: 1.0415 - classification_loss: 0.1438 342/500 [===================>..........] - ETA: 39s - loss: 1.1865 - regression_loss: 1.0425 - classification_loss: 0.1440 343/500 [===================>..........] - ETA: 39s - loss: 1.1887 - regression_loss: 1.0443 - classification_loss: 0.1444 344/500 [===================>..........] - ETA: 39s - loss: 1.1879 - regression_loss: 1.0437 - classification_loss: 0.1442 345/500 [===================>..........] - ETA: 39s - loss: 1.1893 - regression_loss: 1.0448 - classification_loss: 0.1444 346/500 [===================>..........] - ETA: 38s - loss: 1.1886 - regression_loss: 1.0444 - classification_loss: 0.1442 347/500 [===================>..........] - ETA: 38s - loss: 1.1896 - regression_loss: 1.0452 - classification_loss: 0.1444 348/500 [===================>..........] - ETA: 38s - loss: 1.1889 - regression_loss: 1.0447 - classification_loss: 0.1443 349/500 [===================>..........] - ETA: 38s - loss: 1.1875 - regression_loss: 1.0436 - classification_loss: 0.1440 350/500 [====================>.........] 
- ETA: 37s - loss: 1.1875 - regression_loss: 1.0435 - classification_loss: 0.1440 351/500 [====================>.........] - ETA: 37s - loss: 1.1884 - regression_loss: 1.0443 - classification_loss: 0.1441 352/500 [====================>.........] - ETA: 37s - loss: 1.1899 - regression_loss: 1.0456 - classification_loss: 0.1443 353/500 [====================>.........] - ETA: 37s - loss: 1.1913 - regression_loss: 1.0468 - classification_loss: 0.1445 354/500 [====================>.........] - ETA: 36s - loss: 1.1933 - regression_loss: 1.0482 - classification_loss: 0.1451 355/500 [====================>.........] - ETA: 36s - loss: 1.1913 - regression_loss: 1.0464 - classification_loss: 0.1448 356/500 [====================>.........] - ETA: 36s - loss: 1.1914 - regression_loss: 1.0465 - classification_loss: 0.1448 357/500 [====================>.........] - ETA: 36s - loss: 1.1905 - regression_loss: 1.0457 - classification_loss: 0.1448 358/500 [====================>.........] - ETA: 35s - loss: 1.1921 - regression_loss: 1.0471 - classification_loss: 0.1450 359/500 [====================>.........] - ETA: 35s - loss: 1.1916 - regression_loss: 1.0468 - classification_loss: 0.1447 360/500 [====================>.........] - ETA: 35s - loss: 1.1903 - regression_loss: 1.0458 - classification_loss: 0.1445 361/500 [====================>.........] - ETA: 35s - loss: 1.1919 - regression_loss: 1.0470 - classification_loss: 0.1448 362/500 [====================>.........] - ETA: 34s - loss: 1.1927 - regression_loss: 1.0479 - classification_loss: 0.1449 363/500 [====================>.........] - ETA: 34s - loss: 1.1928 - regression_loss: 1.0479 - classification_loss: 0.1449 364/500 [====================>.........] - ETA: 34s - loss: 1.1928 - regression_loss: 1.0479 - classification_loss: 0.1449 365/500 [====================>.........] - ETA: 34s - loss: 1.1935 - regression_loss: 1.0486 - classification_loss: 0.1449 366/500 [====================>.........] 
- ETA: 33s - loss: 1.1937 - regression_loss: 1.0488 - classification_loss: 0.1449 367/500 [=====================>........] - ETA: 33s - loss: 1.1928 - regression_loss: 1.0480 - classification_loss: 0.1448 368/500 [=====================>........] - ETA: 33s - loss: 1.1923 - regression_loss: 1.0476 - classification_loss: 0.1447 369/500 [=====================>........] - ETA: 33s - loss: 1.1933 - regression_loss: 1.0486 - classification_loss: 0.1448 370/500 [=====================>........] - ETA: 32s - loss: 1.1920 - regression_loss: 1.0475 - classification_loss: 0.1445 371/500 [=====================>........] - ETA: 32s - loss: 1.1907 - regression_loss: 1.0464 - classification_loss: 0.1444 372/500 [=====================>........] - ETA: 32s - loss: 1.1894 - regression_loss: 1.0452 - classification_loss: 0.1441 373/500 [=====================>........] - ETA: 32s - loss: 1.1884 - regression_loss: 1.0444 - classification_loss: 0.1440 374/500 [=====================>........] - ETA: 31s - loss: 1.1894 - regression_loss: 1.0452 - classification_loss: 0.1442 375/500 [=====================>........] - ETA: 31s - loss: 1.1900 - regression_loss: 1.0458 - classification_loss: 0.1442 376/500 [=====================>........] - ETA: 31s - loss: 1.1908 - regression_loss: 1.0465 - classification_loss: 0.1443 377/500 [=====================>........] - ETA: 31s - loss: 1.1888 - regression_loss: 1.0448 - classification_loss: 0.1440 378/500 [=====================>........] - ETA: 30s - loss: 1.1880 - regression_loss: 1.0441 - classification_loss: 0.1439 379/500 [=====================>........] - ETA: 30s - loss: 1.1891 - regression_loss: 1.0449 - classification_loss: 0.1442 380/500 [=====================>........] - ETA: 30s - loss: 1.1900 - regression_loss: 1.0457 - classification_loss: 0.1444 381/500 [=====================>........] - ETA: 30s - loss: 1.1909 - regression_loss: 1.0464 - classification_loss: 0.1445 382/500 [=====================>........] 
- ETA: 29s - loss: 1.1911 - regression_loss: 1.0467 - classification_loss: 0.1445 383/500 [=====================>........] - ETA: 29s - loss: 1.1920 - regression_loss: 1.0474 - classification_loss: 0.1446 384/500 [======================>.......] - ETA: 29s - loss: 1.1916 - regression_loss: 1.0472 - classification_loss: 0.1445 385/500 [======================>.......] - ETA: 29s - loss: 1.1916 - regression_loss: 1.0472 - classification_loss: 0.1444 386/500 [======================>.......] - ETA: 28s - loss: 1.1902 - regression_loss: 1.0460 - classification_loss: 0.1442 387/500 [======================>.......] - ETA: 28s - loss: 1.1908 - regression_loss: 1.0464 - classification_loss: 0.1444 388/500 [======================>.......] - ETA: 28s - loss: 1.1900 - regression_loss: 1.0457 - classification_loss: 0.1443 389/500 [======================>.......] - ETA: 28s - loss: 1.1904 - regression_loss: 1.0460 - classification_loss: 0.1444 390/500 [======================>.......] - ETA: 27s - loss: 1.1918 - regression_loss: 1.0472 - classification_loss: 0.1445 391/500 [======================>.......] - ETA: 27s - loss: 1.1916 - regression_loss: 1.0472 - classification_loss: 0.1444 392/500 [======================>.......] - ETA: 27s - loss: 1.1918 - regression_loss: 1.0473 - classification_loss: 0.1445 393/500 [======================>.......] - ETA: 27s - loss: 1.1914 - regression_loss: 1.0470 - classification_loss: 0.1443 394/500 [======================>.......] - ETA: 26s - loss: 1.1916 - regression_loss: 1.0471 - classification_loss: 0.1446 395/500 [======================>.......] - ETA: 26s - loss: 1.1935 - regression_loss: 1.0486 - classification_loss: 0.1449 396/500 [======================>.......] - ETA: 26s - loss: 1.1934 - regression_loss: 1.0485 - classification_loss: 0.1449 397/500 [======================>.......] - ETA: 26s - loss: 1.1934 - regression_loss: 1.0486 - classification_loss: 0.1448 398/500 [======================>.......] 
- ETA: 25s - loss: 1.1924 - regression_loss: 1.0477 - classification_loss: 0.1447 399/500 [======================>.......] - ETA: 25s - loss: 1.1934 - regression_loss: 1.0486 - classification_loss: 0.1448 400/500 [=======================>......] - ETA: 25s - loss: 1.1917 - regression_loss: 1.0472 - classification_loss: 0.1445 401/500 [=======================>......] - ETA: 25s - loss: 1.1921 - regression_loss: 1.0476 - classification_loss: 0.1445 402/500 [=======================>......] - ETA: 24s - loss: 1.1908 - regression_loss: 1.0465 - classification_loss: 0.1443 403/500 [=======================>......] - ETA: 24s - loss: 1.1903 - regression_loss: 1.0460 - classification_loss: 0.1442 404/500 [=======================>......] - ETA: 24s - loss: 1.1909 - regression_loss: 1.0462 - classification_loss: 0.1447 405/500 [=======================>......] - ETA: 23s - loss: 1.1924 - regression_loss: 1.0474 - classification_loss: 0.1449 406/500 [=======================>......] - ETA: 23s - loss: 1.1926 - regression_loss: 1.0476 - classification_loss: 0.1450 407/500 [=======================>......] - ETA: 23s - loss: 1.1928 - regression_loss: 1.0478 - classification_loss: 0.1450 408/500 [=======================>......] - ETA: 23s - loss: 1.1920 - regression_loss: 1.0472 - classification_loss: 0.1448 409/500 [=======================>......] - ETA: 22s - loss: 1.1907 - regression_loss: 1.0461 - classification_loss: 0.1445 410/500 [=======================>......] - ETA: 22s - loss: 1.1899 - regression_loss: 1.0455 - classification_loss: 0.1444 411/500 [=======================>......] - ETA: 22s - loss: 1.1908 - regression_loss: 1.0462 - classification_loss: 0.1445 412/500 [=======================>......] - ETA: 22s - loss: 1.1892 - regression_loss: 1.0449 - classification_loss: 0.1444 413/500 [=======================>......] - ETA: 21s - loss: 1.1882 - regression_loss: 1.0439 - classification_loss: 0.1442 414/500 [=======================>......] 
- ETA: 21s - loss: 1.1864 - regression_loss: 1.0424 - classification_loss: 0.1440 415/500 [=======================>......] - ETA: 21s - loss: 1.1855 - regression_loss: 1.0417 - classification_loss: 0.1438 416/500 [=======================>......] - ETA: 21s - loss: 1.1856 - regression_loss: 1.0419 - classification_loss: 0.1437 417/500 [========================>.....] - ETA: 20s - loss: 1.1849 - regression_loss: 1.0413 - classification_loss: 0.1436 418/500 [========================>.....] - ETA: 20s - loss: 1.1856 - regression_loss: 1.0419 - classification_loss: 0.1437 419/500 [========================>.....] - ETA: 20s - loss: 1.1841 - regression_loss: 1.0406 - classification_loss: 0.1435 420/500 [========================>.....] - ETA: 20s - loss: 1.1838 - regression_loss: 1.0404 - classification_loss: 0.1434 421/500 [========================>.....] - ETA: 19s - loss: 1.1826 - regression_loss: 1.0395 - classification_loss: 0.1432 422/500 [========================>.....] - ETA: 19s - loss: 1.1831 - regression_loss: 1.0398 - classification_loss: 0.1433 423/500 [========================>.....] - ETA: 19s - loss: 1.1829 - regression_loss: 1.0396 - classification_loss: 0.1433 424/500 [========================>.....] - ETA: 19s - loss: 1.1838 - regression_loss: 1.0404 - classification_loss: 0.1435 425/500 [========================>.....] - ETA: 18s - loss: 1.1822 - regression_loss: 1.0390 - classification_loss: 0.1432 426/500 [========================>.....] - ETA: 18s - loss: 1.1832 - regression_loss: 1.0398 - classification_loss: 0.1434 427/500 [========================>.....] - ETA: 18s - loss: 1.1816 - regression_loss: 1.0384 - classification_loss: 0.1431 428/500 [========================>.....] - ETA: 18s - loss: 1.1817 - regression_loss: 1.0385 - classification_loss: 0.1431 429/500 [========================>.....] - ETA: 17s - loss: 1.1796 - regression_loss: 1.0368 - classification_loss: 0.1429 430/500 [========================>.....] 
- ETA: 17s - loss: 1.1803 - regression_loss: 1.0374 - classification_loss: 0.1429 431/500 [========================>.....] - ETA: 17s - loss: 1.1788 - regression_loss: 1.0361 - classification_loss: 0.1427 432/500 [========================>.....] - ETA: 17s - loss: 1.1791 - regression_loss: 1.0364 - classification_loss: 0.1427 433/500 [========================>.....] - ETA: 16s - loss: 1.1786 - regression_loss: 1.0360 - classification_loss: 0.1426 434/500 [=========================>....] - ETA: 16s - loss: 1.1784 - regression_loss: 1.0359 - classification_loss: 0.1425 435/500 [=========================>....] - ETA: 16s - loss: 1.1779 - regression_loss: 1.0355 - classification_loss: 0.1425 436/500 [=========================>....] - ETA: 16s - loss: 1.1784 - regression_loss: 1.0359 - classification_loss: 0.1425 437/500 [=========================>....] - ETA: 15s - loss: 1.1788 - regression_loss: 1.0362 - classification_loss: 0.1426 438/500 [=========================>....] - ETA: 15s - loss: 1.1786 - regression_loss: 1.0361 - classification_loss: 0.1425 439/500 [=========================>....] - ETA: 15s - loss: 1.1797 - regression_loss: 1.0370 - classification_loss: 0.1426 440/500 [=========================>....] - ETA: 15s - loss: 1.1802 - regression_loss: 1.0375 - classification_loss: 0.1427 441/500 [=========================>....] - ETA: 14s - loss: 1.1809 - regression_loss: 1.0381 - classification_loss: 0.1427 442/500 [=========================>....] - ETA: 14s - loss: 1.1816 - regression_loss: 1.0388 - classification_loss: 0.1428 443/500 [=========================>....] - ETA: 14s - loss: 1.1811 - regression_loss: 1.0385 - classification_loss: 0.1426 444/500 [=========================>....] - ETA: 14s - loss: 1.1814 - regression_loss: 1.0388 - classification_loss: 0.1426 445/500 [=========================>....] - ETA: 13s - loss: 1.1824 - regression_loss: 1.0396 - classification_loss: 0.1428 446/500 [=========================>....] 
- ETA: 13s - loss: 1.1828 - regression_loss: 1.0400 - classification_loss: 0.1429 447/500 [=========================>....] - ETA: 13s - loss: 1.1825 - regression_loss: 1.0397 - classification_loss: 0.1428 448/500 [=========================>....] - ETA: 13s - loss: 1.1811 - regression_loss: 1.0385 - classification_loss: 0.1425 449/500 [=========================>....] - ETA: 12s - loss: 1.1830 - regression_loss: 1.0402 - classification_loss: 0.1428 450/500 [==========================>...] - ETA: 12s - loss: 1.1829 - regression_loss: 1.0401 - classification_loss: 0.1428 451/500 [==========================>...] - ETA: 12s - loss: 1.1815 - regression_loss: 1.0389 - classification_loss: 0.1426 452/500 [==========================>...] - ETA: 12s - loss: 1.1821 - regression_loss: 1.0394 - classification_loss: 0.1427 453/500 [==========================>...] - ETA: 11s - loss: 1.1826 - regression_loss: 1.0398 - classification_loss: 0.1427 454/500 [==========================>...] - ETA: 11s - loss: 1.1818 - regression_loss: 1.0392 - classification_loss: 0.1426 455/500 [==========================>...] - ETA: 11s - loss: 1.1816 - regression_loss: 1.0390 - classification_loss: 0.1427 456/500 [==========================>...] - ETA: 11s - loss: 1.1823 - regression_loss: 1.0395 - classification_loss: 0.1428 457/500 [==========================>...] - ETA: 10s - loss: 1.1827 - regression_loss: 1.0399 - classification_loss: 0.1428 458/500 [==========================>...] - ETA: 10s - loss: 1.1822 - regression_loss: 1.0394 - classification_loss: 0.1428 459/500 [==========================>...] - ETA: 10s - loss: 1.1829 - regression_loss: 1.0397 - classification_loss: 0.1432 460/500 [==========================>...] - ETA: 10s - loss: 1.1833 - regression_loss: 1.0401 - classification_loss: 0.1432 461/500 [==========================>...] - ETA: 9s - loss: 1.1825 - regression_loss: 1.0395 - classification_loss: 0.1430  462/500 [==========================>...] 
[per-batch progress updates for the remainder of epoch 26 condensed; running loss ~1.18]
500/500 [==============================] - 127s 253ms/step - loss: 1.1869 - regression_loss: 1.0423 - classification_loss: 0.1446
1172 instances of class plum with average precision: 0.7542
mAP: 0.7542
Epoch 00026: saving model to ./training/snapshots/resnet50_pascal_26.h5
Epoch 27/150
[per-batch progress updates for steps 1-296 of epoch 27 condensed; running loss ~1.17 (regression_loss ~1.03, classification_loss ~0.15)]
297/500 [================>.............]
- ETA: 51s - loss: 1.1748 - regression_loss: 1.0294 - classification_loss: 0.1454 298/500 [================>.............] - ETA: 50s - loss: 1.1748 - regression_loss: 1.0294 - classification_loss: 0.1454 299/500 [================>.............] - ETA: 50s - loss: 1.1744 - regression_loss: 1.0291 - classification_loss: 0.1453 300/500 [=================>............] - ETA: 50s - loss: 1.1741 - regression_loss: 1.0289 - classification_loss: 0.1452 301/500 [=================>............] - ETA: 50s - loss: 1.1741 - regression_loss: 1.0291 - classification_loss: 0.1450 302/500 [=================>............] - ETA: 49s - loss: 1.1722 - regression_loss: 1.0274 - classification_loss: 0.1448 303/500 [=================>............] - ETA: 49s - loss: 1.1716 - regression_loss: 1.0269 - classification_loss: 0.1447 304/500 [=================>............] - ETA: 49s - loss: 1.1713 - regression_loss: 1.0267 - classification_loss: 0.1446 305/500 [=================>............] - ETA: 49s - loss: 1.1728 - regression_loss: 1.0280 - classification_loss: 0.1448 306/500 [=================>............] - ETA: 48s - loss: 1.1750 - regression_loss: 1.0299 - classification_loss: 0.1451 307/500 [=================>............] - ETA: 48s - loss: 1.1753 - regression_loss: 1.0302 - classification_loss: 0.1451 308/500 [=================>............] - ETA: 48s - loss: 1.1767 - regression_loss: 1.0312 - classification_loss: 0.1455 309/500 [=================>............] - ETA: 48s - loss: 1.1750 - regression_loss: 1.0297 - classification_loss: 0.1453 310/500 [=================>............] - ETA: 47s - loss: 1.1759 - regression_loss: 1.0303 - classification_loss: 0.1456 311/500 [=================>............] - ETA: 47s - loss: 1.1758 - regression_loss: 1.0302 - classification_loss: 0.1456 312/500 [=================>............] - ETA: 47s - loss: 1.1756 - regression_loss: 1.0300 - classification_loss: 0.1456 313/500 [=================>............] 
- ETA: 47s - loss: 1.1734 - regression_loss: 1.0282 - classification_loss: 0.1452 314/500 [=================>............] - ETA: 46s - loss: 1.1744 - regression_loss: 1.0289 - classification_loss: 0.1455 315/500 [=================>............] - ETA: 46s - loss: 1.1753 - regression_loss: 1.0298 - classification_loss: 0.1455 316/500 [=================>............] - ETA: 46s - loss: 1.1755 - regression_loss: 1.0301 - classification_loss: 0.1454 317/500 [==================>...........] - ETA: 46s - loss: 1.1761 - regression_loss: 1.0306 - classification_loss: 0.1455 318/500 [==================>...........] - ETA: 45s - loss: 1.1755 - regression_loss: 1.0301 - classification_loss: 0.1454 319/500 [==================>...........] - ETA: 45s - loss: 1.1754 - regression_loss: 1.0301 - classification_loss: 0.1453 320/500 [==================>...........] - ETA: 45s - loss: 1.1752 - regression_loss: 1.0300 - classification_loss: 0.1452 321/500 [==================>...........] - ETA: 45s - loss: 1.1763 - regression_loss: 1.0309 - classification_loss: 0.1454 322/500 [==================>...........] - ETA: 44s - loss: 1.1758 - regression_loss: 1.0306 - classification_loss: 0.1452 323/500 [==================>...........] - ETA: 44s - loss: 1.1751 - regression_loss: 1.0300 - classification_loss: 0.1452 324/500 [==================>...........] - ETA: 44s - loss: 1.1744 - regression_loss: 1.0294 - classification_loss: 0.1450 325/500 [==================>...........] - ETA: 44s - loss: 1.1727 - regression_loss: 1.0280 - classification_loss: 0.1447 326/500 [==================>...........] - ETA: 43s - loss: 1.1722 - regression_loss: 1.0275 - classification_loss: 0.1447 327/500 [==================>...........] - ETA: 43s - loss: 1.1706 - regression_loss: 1.0262 - classification_loss: 0.1444 328/500 [==================>...........] - ETA: 43s - loss: 1.1721 - regression_loss: 1.0274 - classification_loss: 0.1446 329/500 [==================>...........] 
- ETA: 43s - loss: 1.1717 - regression_loss: 1.0271 - classification_loss: 0.1445 330/500 [==================>...........] - ETA: 42s - loss: 1.1727 - regression_loss: 1.0277 - classification_loss: 0.1450 331/500 [==================>...........] - ETA: 42s - loss: 1.1738 - regression_loss: 1.0287 - classification_loss: 0.1451 332/500 [==================>...........] - ETA: 42s - loss: 1.1738 - regression_loss: 1.0288 - classification_loss: 0.1450 333/500 [==================>...........] - ETA: 42s - loss: 1.1757 - regression_loss: 1.0307 - classification_loss: 0.1450 334/500 [===================>..........] - ETA: 41s - loss: 1.1762 - regression_loss: 1.0312 - classification_loss: 0.1450 335/500 [===================>..........] - ETA: 41s - loss: 1.1753 - regression_loss: 1.0304 - classification_loss: 0.1449 336/500 [===================>..........] - ETA: 41s - loss: 1.1743 - regression_loss: 1.0297 - classification_loss: 0.1446 337/500 [===================>..........] - ETA: 41s - loss: 1.1754 - regression_loss: 1.0306 - classification_loss: 0.1448 338/500 [===================>..........] - ETA: 40s - loss: 1.1748 - regression_loss: 1.0301 - classification_loss: 0.1447 339/500 [===================>..........] - ETA: 40s - loss: 1.1734 - regression_loss: 1.0291 - classification_loss: 0.1443 340/500 [===================>..........] - ETA: 40s - loss: 1.1728 - regression_loss: 1.0287 - classification_loss: 0.1440 341/500 [===================>..........] - ETA: 40s - loss: 1.1716 - regression_loss: 1.0278 - classification_loss: 0.1438 342/500 [===================>..........] - ETA: 39s - loss: 1.1717 - regression_loss: 1.0280 - classification_loss: 0.1437 343/500 [===================>..........] - ETA: 39s - loss: 1.1721 - regression_loss: 1.0283 - classification_loss: 0.1438 344/500 [===================>..........] - ETA: 39s - loss: 1.1707 - regression_loss: 1.0271 - classification_loss: 0.1435 345/500 [===================>..........] 
- ETA: 39s - loss: 1.1712 - regression_loss: 1.0276 - classification_loss: 0.1437 346/500 [===================>..........] - ETA: 38s - loss: 1.1699 - regression_loss: 1.0263 - classification_loss: 0.1436 347/500 [===================>..........] - ETA: 38s - loss: 1.1706 - regression_loss: 1.0270 - classification_loss: 0.1437 348/500 [===================>..........] - ETA: 38s - loss: 1.1697 - regression_loss: 1.0263 - classification_loss: 0.1434 349/500 [===================>..........] - ETA: 38s - loss: 1.1688 - regression_loss: 1.0255 - classification_loss: 0.1433 350/500 [====================>.........] - ETA: 37s - loss: 1.1702 - regression_loss: 1.0267 - classification_loss: 0.1435 351/500 [====================>.........] - ETA: 37s - loss: 1.1707 - regression_loss: 1.0272 - classification_loss: 0.1435 352/500 [====================>.........] - ETA: 37s - loss: 1.1686 - regression_loss: 1.0254 - classification_loss: 0.1432 353/500 [====================>.........] - ETA: 37s - loss: 1.1679 - regression_loss: 1.0249 - classification_loss: 0.1430 354/500 [====================>.........] - ETA: 36s - loss: 1.1683 - regression_loss: 1.0253 - classification_loss: 0.1430 355/500 [====================>.........] - ETA: 36s - loss: 1.1687 - regression_loss: 1.0257 - classification_loss: 0.1429 356/500 [====================>.........] - ETA: 36s - loss: 1.1676 - regression_loss: 1.0247 - classification_loss: 0.1429 357/500 [====================>.........] - ETA: 36s - loss: 1.1681 - regression_loss: 1.0251 - classification_loss: 0.1430 358/500 [====================>.........] - ETA: 35s - loss: 1.1672 - regression_loss: 1.0244 - classification_loss: 0.1428 359/500 [====================>.........] - ETA: 35s - loss: 1.1681 - regression_loss: 1.0250 - classification_loss: 0.1431 360/500 [====================>.........] - ETA: 35s - loss: 1.1673 - regression_loss: 1.0243 - classification_loss: 0.1429 361/500 [====================>.........] 
- ETA: 35s - loss: 1.1678 - regression_loss: 1.0248 - classification_loss: 0.1430 362/500 [====================>.........] - ETA: 34s - loss: 1.1669 - regression_loss: 1.0242 - classification_loss: 0.1428 363/500 [====================>.........] - ETA: 34s - loss: 1.1696 - regression_loss: 1.0263 - classification_loss: 0.1433 364/500 [====================>.........] - ETA: 34s - loss: 1.1678 - regression_loss: 1.0247 - classification_loss: 0.1431 365/500 [====================>.........] - ETA: 34s - loss: 1.1681 - regression_loss: 1.0249 - classification_loss: 0.1432 366/500 [====================>.........] - ETA: 33s - loss: 1.1675 - regression_loss: 1.0244 - classification_loss: 0.1431 367/500 [=====================>........] - ETA: 33s - loss: 1.1679 - regression_loss: 1.0249 - classification_loss: 0.1431 368/500 [=====================>........] - ETA: 33s - loss: 1.1691 - regression_loss: 1.0258 - classification_loss: 0.1433 369/500 [=====================>........] - ETA: 33s - loss: 1.1694 - regression_loss: 1.0261 - classification_loss: 0.1433 370/500 [=====================>........] - ETA: 32s - loss: 1.1695 - regression_loss: 1.0261 - classification_loss: 0.1434 371/500 [=====================>........] - ETA: 32s - loss: 1.1691 - regression_loss: 1.0257 - classification_loss: 0.1434 372/500 [=====================>........] - ETA: 32s - loss: 1.1695 - regression_loss: 1.0261 - classification_loss: 0.1435 373/500 [=====================>........] - ETA: 32s - loss: 1.1699 - regression_loss: 1.0263 - classification_loss: 0.1436 374/500 [=====================>........] - ETA: 31s - loss: 1.1679 - regression_loss: 1.0245 - classification_loss: 0.1433 375/500 [=====================>........] - ETA: 31s - loss: 1.1666 - regression_loss: 1.0235 - classification_loss: 0.1431 376/500 [=====================>........] - ETA: 31s - loss: 1.1667 - regression_loss: 1.0236 - classification_loss: 0.1431 377/500 [=====================>........] 
- ETA: 30s - loss: 1.1670 - regression_loss: 1.0238 - classification_loss: 0.1432 378/500 [=====================>........] - ETA: 30s - loss: 1.1651 - regression_loss: 1.0222 - classification_loss: 0.1430 379/500 [=====================>........] - ETA: 30s - loss: 1.1641 - regression_loss: 1.0214 - classification_loss: 0.1427 380/500 [=====================>........] - ETA: 30s - loss: 1.1630 - regression_loss: 1.0206 - classification_loss: 0.1424 381/500 [=====================>........] - ETA: 29s - loss: 1.1630 - regression_loss: 1.0206 - classification_loss: 0.1424 382/500 [=====================>........] - ETA: 29s - loss: 1.1644 - regression_loss: 1.0220 - classification_loss: 0.1424 383/500 [=====================>........] - ETA: 29s - loss: 1.1631 - regression_loss: 1.0210 - classification_loss: 0.1421 384/500 [======================>.......] - ETA: 29s - loss: 1.1614 - regression_loss: 1.0194 - classification_loss: 0.1420 385/500 [======================>.......] - ETA: 28s - loss: 1.1617 - regression_loss: 1.0196 - classification_loss: 0.1420 386/500 [======================>.......] - ETA: 28s - loss: 1.1612 - regression_loss: 1.0193 - classification_loss: 0.1419 387/500 [======================>.......] - ETA: 28s - loss: 1.1619 - regression_loss: 1.0198 - classification_loss: 0.1421 388/500 [======================>.......] - ETA: 28s - loss: 1.1613 - regression_loss: 1.0194 - classification_loss: 0.1419 389/500 [======================>.......] - ETA: 27s - loss: 1.1614 - regression_loss: 1.0195 - classification_loss: 0.1419 390/500 [======================>.......] - ETA: 27s - loss: 1.1606 - regression_loss: 1.0189 - classification_loss: 0.1417 391/500 [======================>.......] - ETA: 27s - loss: 1.1643 - regression_loss: 1.0219 - classification_loss: 0.1423 392/500 [======================>.......] - ETA: 27s - loss: 1.1646 - regression_loss: 1.0222 - classification_loss: 0.1424 393/500 [======================>.......] 
- ETA: 26s - loss: 1.1626 - regression_loss: 1.0205 - classification_loss: 0.1421 394/500 [======================>.......] - ETA: 26s - loss: 1.1627 - regression_loss: 1.0206 - classification_loss: 0.1421 395/500 [======================>.......] - ETA: 26s - loss: 1.1637 - regression_loss: 1.0215 - classification_loss: 0.1422 396/500 [======================>.......] - ETA: 26s - loss: 1.1649 - regression_loss: 1.0225 - classification_loss: 0.1424 397/500 [======================>.......] - ETA: 25s - loss: 1.1642 - regression_loss: 1.0219 - classification_loss: 0.1424 398/500 [======================>.......] - ETA: 25s - loss: 1.1630 - regression_loss: 1.0208 - classification_loss: 0.1422 399/500 [======================>.......] - ETA: 25s - loss: 1.1634 - regression_loss: 1.0211 - classification_loss: 0.1423 400/500 [=======================>......] - ETA: 25s - loss: 1.1615 - regression_loss: 1.0194 - classification_loss: 0.1421 401/500 [=======================>......] - ETA: 24s - loss: 1.1601 - regression_loss: 1.0182 - classification_loss: 0.1419 402/500 [=======================>......] - ETA: 24s - loss: 1.1611 - regression_loss: 1.0192 - classification_loss: 0.1420 403/500 [=======================>......] - ETA: 24s - loss: 1.1611 - regression_loss: 1.0190 - classification_loss: 0.1421 404/500 [=======================>......] - ETA: 24s - loss: 1.1615 - regression_loss: 1.0195 - classification_loss: 0.1421 405/500 [=======================>......] - ETA: 23s - loss: 1.1626 - regression_loss: 1.0205 - classification_loss: 0.1422 406/500 [=======================>......] - ETA: 23s - loss: 1.1611 - regression_loss: 1.0191 - classification_loss: 0.1420 407/500 [=======================>......] - ETA: 23s - loss: 1.1610 - regression_loss: 1.0190 - classification_loss: 0.1420 408/500 [=======================>......] - ETA: 23s - loss: 1.1615 - regression_loss: 1.0195 - classification_loss: 0.1420 409/500 [=======================>......] 
- ETA: 22s - loss: 1.1601 - regression_loss: 1.0183 - classification_loss: 0.1418 410/500 [=======================>......] - ETA: 22s - loss: 1.1603 - regression_loss: 1.0184 - classification_loss: 0.1419 411/500 [=======================>......] - ETA: 22s - loss: 1.1599 - regression_loss: 1.0179 - classification_loss: 0.1420 412/500 [=======================>......] - ETA: 22s - loss: 1.1584 - regression_loss: 1.0166 - classification_loss: 0.1418 413/500 [=======================>......] - ETA: 21s - loss: 1.1589 - regression_loss: 1.0171 - classification_loss: 0.1418 414/500 [=======================>......] - ETA: 21s - loss: 1.1593 - regression_loss: 1.0175 - classification_loss: 0.1418 415/500 [=======================>......] - ETA: 21s - loss: 1.1600 - regression_loss: 1.0183 - classification_loss: 0.1417 416/500 [=======================>......] - ETA: 21s - loss: 1.1595 - regression_loss: 1.0180 - classification_loss: 0.1415 417/500 [========================>.....] - ETA: 20s - loss: 1.1600 - regression_loss: 1.0185 - classification_loss: 0.1415 418/500 [========================>.....] - ETA: 20s - loss: 1.1602 - regression_loss: 1.0187 - classification_loss: 0.1415 419/500 [========================>.....] - ETA: 20s - loss: 1.1613 - regression_loss: 1.0196 - classification_loss: 0.1417 420/500 [========================>.....] - ETA: 20s - loss: 1.1609 - regression_loss: 1.0192 - classification_loss: 0.1417 421/500 [========================>.....] - ETA: 19s - loss: 1.1600 - regression_loss: 1.0185 - classification_loss: 0.1415 422/500 [========================>.....] - ETA: 19s - loss: 1.1614 - regression_loss: 1.0196 - classification_loss: 0.1418 423/500 [========================>.....] - ETA: 19s - loss: 1.1624 - regression_loss: 1.0204 - classification_loss: 0.1419 424/500 [========================>.....] - ETA: 19s - loss: 1.1631 - regression_loss: 1.0210 - classification_loss: 0.1421 425/500 [========================>.....] 
- ETA: 18s - loss: 1.1636 - regression_loss: 1.0211 - classification_loss: 0.1425 426/500 [========================>.....] - ETA: 18s - loss: 1.1622 - regression_loss: 1.0198 - classification_loss: 0.1423 427/500 [========================>.....] - ETA: 18s - loss: 1.1623 - regression_loss: 1.0199 - classification_loss: 0.1424 428/500 [========================>.....] - ETA: 18s - loss: 1.1618 - regression_loss: 1.0193 - classification_loss: 0.1425 429/500 [========================>.....] - ETA: 17s - loss: 1.1629 - regression_loss: 1.0202 - classification_loss: 0.1427 430/500 [========================>.....] - ETA: 17s - loss: 1.1630 - regression_loss: 1.0203 - classification_loss: 0.1427 431/500 [========================>.....] - ETA: 17s - loss: 1.1616 - regression_loss: 1.0190 - classification_loss: 0.1425 432/500 [========================>.....] - ETA: 17s - loss: 1.1622 - regression_loss: 1.0196 - classification_loss: 0.1426 433/500 [========================>.....] - ETA: 16s - loss: 1.1621 - regression_loss: 1.0195 - classification_loss: 0.1426 434/500 [=========================>....] - ETA: 16s - loss: 1.1621 - regression_loss: 1.0194 - classification_loss: 0.1426 435/500 [=========================>....] - ETA: 16s - loss: 1.1628 - regression_loss: 1.0201 - classification_loss: 0.1427 436/500 [=========================>....] - ETA: 16s - loss: 1.1620 - regression_loss: 1.0195 - classification_loss: 0.1425 437/500 [=========================>....] - ETA: 15s - loss: 1.1624 - regression_loss: 1.0199 - classification_loss: 0.1426 438/500 [=========================>....] - ETA: 15s - loss: 1.1629 - regression_loss: 1.0203 - classification_loss: 0.1426 439/500 [=========================>....] - ETA: 15s - loss: 1.1640 - regression_loss: 1.0213 - classification_loss: 0.1427 440/500 [=========================>....] - ETA: 15s - loss: 1.1640 - regression_loss: 1.0214 - classification_loss: 0.1426 441/500 [=========================>....] 
- ETA: 14s - loss: 1.1637 - regression_loss: 1.0212 - classification_loss: 0.1426 442/500 [=========================>....] - ETA: 14s - loss: 1.1636 - regression_loss: 1.0212 - classification_loss: 0.1424 443/500 [=========================>....] - ETA: 14s - loss: 1.1635 - regression_loss: 1.0212 - classification_loss: 0.1423 444/500 [=========================>....] - ETA: 14s - loss: 1.1633 - regression_loss: 1.0210 - classification_loss: 0.1423 445/500 [=========================>....] - ETA: 13s - loss: 1.1624 - regression_loss: 1.0203 - classification_loss: 0.1422 446/500 [=========================>....] - ETA: 13s - loss: 1.1623 - regression_loss: 1.0201 - classification_loss: 0.1422 447/500 [=========================>....] - ETA: 13s - loss: 1.1629 - regression_loss: 1.0206 - classification_loss: 0.1423 448/500 [=========================>....] - ETA: 13s - loss: 1.1638 - regression_loss: 1.0213 - classification_loss: 0.1426 449/500 [=========================>....] - ETA: 12s - loss: 1.1637 - regression_loss: 1.0211 - classification_loss: 0.1427 450/500 [==========================>...] - ETA: 12s - loss: 1.1643 - regression_loss: 1.0217 - classification_loss: 0.1426 451/500 [==========================>...] - ETA: 12s - loss: 1.1660 - regression_loss: 1.0230 - classification_loss: 0.1429 452/500 [==========================>...] - ETA: 12s - loss: 1.1655 - regression_loss: 1.0226 - classification_loss: 0.1429 453/500 [==========================>...] - ETA: 11s - loss: 1.1656 - regression_loss: 1.0227 - classification_loss: 0.1429 454/500 [==========================>...] - ETA: 11s - loss: 1.1658 - regression_loss: 1.0229 - classification_loss: 0.1430 455/500 [==========================>...] - ETA: 11s - loss: 1.1664 - regression_loss: 1.0233 - classification_loss: 0.1431 456/500 [==========================>...] - ETA: 11s - loss: 1.1670 - regression_loss: 1.0238 - classification_loss: 0.1432 457/500 [==========================>...] 
- ETA: 10s - loss: 1.1676 - regression_loss: 1.0244 - classification_loss: 0.1433 458/500 [==========================>...] - ETA: 10s - loss: 1.1671 - regression_loss: 1.0239 - classification_loss: 0.1432 459/500 [==========================>...] - ETA: 10s - loss: 1.1662 - regression_loss: 1.0233 - classification_loss: 0.1429 460/500 [==========================>...] - ETA: 10s - loss: 1.1665 - regression_loss: 1.0234 - classification_loss: 0.1430 461/500 [==========================>...] - ETA: 9s - loss: 1.1672 - regression_loss: 1.0241 - classification_loss: 0.1431  462/500 [==========================>...] - ETA: 9s - loss: 1.1657 - regression_loss: 1.0228 - classification_loss: 0.1429 463/500 [==========================>...] - ETA: 9s - loss: 1.1657 - regression_loss: 1.0228 - classification_loss: 0.1429 464/500 [==========================>...] - ETA: 9s - loss: 1.1664 - regression_loss: 1.0233 - classification_loss: 0.1431 465/500 [==========================>...] - ETA: 8s - loss: 1.1662 - regression_loss: 1.0231 - classification_loss: 0.1431 466/500 [==========================>...] - ETA: 8s - loss: 1.1670 - regression_loss: 1.0238 - classification_loss: 0.1432 467/500 [===========================>..] - ETA: 8s - loss: 1.1660 - regression_loss: 1.0229 - classification_loss: 0.1430 468/500 [===========================>..] - ETA: 8s - loss: 1.1662 - regression_loss: 1.0232 - classification_loss: 0.1431 469/500 [===========================>..] - ETA: 7s - loss: 1.1671 - regression_loss: 1.0240 - classification_loss: 0.1432 470/500 [===========================>..] - ETA: 7s - loss: 1.1658 - regression_loss: 1.0229 - classification_loss: 0.1429 471/500 [===========================>..] - ETA: 7s - loss: 1.1652 - regression_loss: 1.0224 - classification_loss: 0.1428 472/500 [===========================>..] - ETA: 7s - loss: 1.1651 - regression_loss: 1.0223 - classification_loss: 0.1428 473/500 [===========================>..] 
- ETA: 6s - loss: 1.1653 - regression_loss: 1.0226 - classification_loss: 0.1427 474/500 [===========================>..] - ETA: 6s - loss: 1.1660 - regression_loss: 1.0232 - classification_loss: 0.1428 475/500 [===========================>..] - ETA: 6s - loss: 1.1667 - regression_loss: 1.0238 - classification_loss: 0.1429 476/500 [===========================>..] - ETA: 6s - loss: 1.1682 - regression_loss: 1.0251 - classification_loss: 0.1431 477/500 [===========================>..] - ETA: 5s - loss: 1.1681 - regression_loss: 1.0251 - classification_loss: 0.1430 478/500 [===========================>..] - ETA: 5s - loss: 1.1677 - regression_loss: 1.0248 - classification_loss: 0.1429 479/500 [===========================>..] - ETA: 5s - loss: 1.1683 - regression_loss: 1.0253 - classification_loss: 0.1430 480/500 [===========================>..] - ETA: 5s - loss: 1.1687 - regression_loss: 1.0258 - classification_loss: 0.1430 481/500 [===========================>..] - ETA: 4s - loss: 1.1684 - regression_loss: 1.0256 - classification_loss: 0.1428 482/500 [===========================>..] - ETA: 4s - loss: 1.1690 - regression_loss: 1.0260 - classification_loss: 0.1430 483/500 [===========================>..] - ETA: 4s - loss: 1.1707 - regression_loss: 1.0277 - classification_loss: 0.1431 484/500 [============================>.] - ETA: 4s - loss: 1.1717 - regression_loss: 1.0286 - classification_loss: 0.1432 485/500 [============================>.] - ETA: 3s - loss: 1.1728 - regression_loss: 1.0294 - classification_loss: 0.1433 486/500 [============================>.] - ETA: 3s - loss: 1.1735 - regression_loss: 1.0300 - classification_loss: 0.1435 487/500 [============================>.] - ETA: 3s - loss: 1.1730 - regression_loss: 1.0296 - classification_loss: 0.1434 488/500 [============================>.] - ETA: 3s - loss: 1.1729 - regression_loss: 1.0296 - classification_loss: 0.1433 489/500 [============================>.] 
- ETA: 2s - loss: 1.1725 - regression_loss: 1.0293 - classification_loss: 0.1432
[progress-bar redraws for steps 491-499 elided]
500/500 [==============================] - 126s 252ms/step - loss: 1.1694 - regression_loss: 1.0268 - classification_loss: 0.1426
1172 instances of class plum with average precision: 0.7617
mAP: 0.7617
Epoch 00027: saving model to ./training/snapshots/resnet50_pascal_27.h5
Epoch 28/150
1/500 [..............................] - ETA: 1:47 - loss: 1.1877 - regression_loss: 1.0431 - classification_loss: 0.1446
2/500 [..............................] - ETA: 1:59 - loss: 0.8842 - regression_loss: 0.7798 - classification_loss: 0.1044
3/500 [..............................] - ETA: 2:01 - loss: 1.1131 - regression_loss: 0.9456 - classification_loss: 0.1675
4/500 [..............................]
- ETA: 2:03 - loss: 0.9912 - regression_loss: 0.8551 - classification_loss: 0.1361
[progress-bar redraws for steps 5-67 of epoch 28 elided; loss moved from ~0.99 to ~1.14]
68/500 [===>..........................]
- ETA: 1:48 - loss: 1.1246 - regression_loss: 0.9885 - classification_loss: 0.1361 69/500 [===>..........................] - ETA: 1:47 - loss: 1.1207 - regression_loss: 0.9849 - classification_loss: 0.1358 70/500 [===>..........................] - ETA: 1:47 - loss: 1.1241 - regression_loss: 0.9880 - classification_loss: 0.1361 71/500 [===>..........................] - ETA: 1:47 - loss: 1.1163 - regression_loss: 0.9815 - classification_loss: 0.1348 72/500 [===>..........................] - ETA: 1:47 - loss: 1.1164 - regression_loss: 0.9819 - classification_loss: 0.1344 73/500 [===>..........................] - ETA: 1:47 - loss: 1.1189 - regression_loss: 0.9838 - classification_loss: 0.1350 74/500 [===>..........................] - ETA: 1:46 - loss: 1.1244 - regression_loss: 0.9887 - classification_loss: 0.1357 75/500 [===>..........................] - ETA: 1:46 - loss: 1.1289 - regression_loss: 0.9932 - classification_loss: 0.1358 76/500 [===>..........................] - ETA: 1:46 - loss: 1.1274 - regression_loss: 0.9919 - classification_loss: 0.1355 77/500 [===>..........................] - ETA: 1:46 - loss: 1.1203 - regression_loss: 0.9859 - classification_loss: 0.1344 78/500 [===>..........................] - ETA: 1:45 - loss: 1.1258 - regression_loss: 0.9908 - classification_loss: 0.1350 79/500 [===>..........................] - ETA: 1:45 - loss: 1.1330 - regression_loss: 0.9969 - classification_loss: 0.1361 80/500 [===>..........................] - ETA: 1:45 - loss: 1.1314 - regression_loss: 0.9954 - classification_loss: 0.1359 81/500 [===>..........................] - ETA: 1:45 - loss: 1.1348 - regression_loss: 0.9994 - classification_loss: 0.1353 82/500 [===>..........................] - ETA: 1:45 - loss: 1.1284 - regression_loss: 0.9942 - classification_loss: 0.1342 83/500 [===>..........................] - ETA: 1:44 - loss: 1.1303 - regression_loss: 0.9965 - classification_loss: 0.1337 84/500 [====>.........................] 
- ETA: 1:44 - loss: 1.1327 - regression_loss: 0.9992 - classification_loss: 0.1335 85/500 [====>.........................] - ETA: 1:44 - loss: 1.1286 - regression_loss: 0.9950 - classification_loss: 0.1336 86/500 [====>.........................] - ETA: 1:44 - loss: 1.1231 - regression_loss: 0.9903 - classification_loss: 0.1328 87/500 [====>.........................] - ETA: 1:43 - loss: 1.1254 - regression_loss: 0.9924 - classification_loss: 0.1330 88/500 [====>.........................] - ETA: 1:43 - loss: 1.1247 - regression_loss: 0.9917 - classification_loss: 0.1330 89/500 [====>.........................] - ETA: 1:43 - loss: 1.1204 - regression_loss: 0.9878 - classification_loss: 0.1326 90/500 [====>.........................] - ETA: 1:43 - loss: 1.1216 - regression_loss: 0.9890 - classification_loss: 0.1326 91/500 [====>.........................] - ETA: 1:42 - loss: 1.1238 - regression_loss: 0.9913 - classification_loss: 0.1324 92/500 [====>.........................] - ETA: 1:42 - loss: 1.1295 - regression_loss: 0.9961 - classification_loss: 0.1334 93/500 [====>.........................] - ETA: 1:42 - loss: 1.1285 - regression_loss: 0.9950 - classification_loss: 0.1334 94/500 [====>.........................] - ETA: 1:42 - loss: 1.1272 - regression_loss: 0.9939 - classification_loss: 0.1333 95/500 [====>.........................] - ETA: 1:42 - loss: 1.1333 - regression_loss: 0.9996 - classification_loss: 0.1338 96/500 [====>.........................] - ETA: 1:41 - loss: 1.1264 - regression_loss: 0.9936 - classification_loss: 0.1328 97/500 [====>.........................] - ETA: 1:41 - loss: 1.1280 - regression_loss: 0.9955 - classification_loss: 0.1325 98/500 [====>.........................] - ETA: 1:41 - loss: 1.1320 - regression_loss: 0.9989 - classification_loss: 0.1332 99/500 [====>.........................] - ETA: 1:41 - loss: 1.1342 - regression_loss: 1.0007 - classification_loss: 0.1335 100/500 [=====>........................] 
- ETA: 1:40 - loss: 1.1364 - regression_loss: 1.0029 - classification_loss: 0.1335 101/500 [=====>........................] - ETA: 1:40 - loss: 1.1400 - regression_loss: 1.0063 - classification_loss: 0.1336 102/500 [=====>........................] - ETA: 1:40 - loss: 1.1335 - regression_loss: 1.0009 - classification_loss: 0.1326 103/500 [=====>........................] - ETA: 1:40 - loss: 1.1312 - regression_loss: 0.9984 - classification_loss: 0.1329 104/500 [=====>........................] - ETA: 1:39 - loss: 1.1331 - regression_loss: 1.0003 - classification_loss: 0.1328 105/500 [=====>........................] - ETA: 1:39 - loss: 1.1349 - regression_loss: 1.0018 - classification_loss: 0.1332 106/500 [=====>........................] - ETA: 1:39 - loss: 1.1339 - regression_loss: 1.0010 - classification_loss: 0.1329 107/500 [=====>........................] - ETA: 1:38 - loss: 1.1328 - regression_loss: 1.0004 - classification_loss: 0.1324 108/500 [=====>........................] - ETA: 1:38 - loss: 1.1328 - regression_loss: 1.0007 - classification_loss: 0.1321 109/500 [=====>........................] - ETA: 1:38 - loss: 1.1254 - regression_loss: 0.9942 - classification_loss: 0.1312 110/500 [=====>........................] - ETA: 1:38 - loss: 1.1294 - regression_loss: 0.9979 - classification_loss: 0.1316 111/500 [=====>........................] - ETA: 1:37 - loss: 1.1298 - regression_loss: 0.9982 - classification_loss: 0.1316 112/500 [=====>........................] - ETA: 1:37 - loss: 1.1306 - regression_loss: 0.9984 - classification_loss: 0.1322 113/500 [=====>........................] - ETA: 1:37 - loss: 1.1294 - regression_loss: 0.9974 - classification_loss: 0.1320 114/500 [=====>........................] - ETA: 1:37 - loss: 1.1313 - regression_loss: 0.9998 - classification_loss: 0.1315 115/500 [=====>........................] - ETA: 1:36 - loss: 1.1336 - regression_loss: 1.0018 - classification_loss: 0.1318 116/500 [=====>........................] 
- ETA: 1:36 - loss: 1.1352 - regression_loss: 1.0036 - classification_loss: 0.1316 117/500 [======>.......................] - ETA: 1:36 - loss: 1.1395 - regression_loss: 1.0074 - classification_loss: 0.1321 118/500 [======>.......................] - ETA: 1:36 - loss: 1.1414 - regression_loss: 1.0094 - classification_loss: 0.1321 119/500 [======>.......................] - ETA: 1:35 - loss: 1.1415 - regression_loss: 1.0096 - classification_loss: 0.1319 120/500 [======>.......................] - ETA: 1:35 - loss: 1.1391 - regression_loss: 1.0079 - classification_loss: 0.1311 121/500 [======>.......................] - ETA: 1:35 - loss: 1.1395 - regression_loss: 1.0086 - classification_loss: 0.1310 122/500 [======>.......................] - ETA: 1:35 - loss: 1.1363 - regression_loss: 1.0059 - classification_loss: 0.1304 123/500 [======>.......................] - ETA: 1:34 - loss: 1.1331 - regression_loss: 1.0030 - classification_loss: 0.1301 124/500 [======>.......................] - ETA: 1:34 - loss: 1.1299 - regression_loss: 1.0000 - classification_loss: 0.1298 125/500 [======>.......................] - ETA: 1:34 - loss: 1.1285 - regression_loss: 0.9988 - classification_loss: 0.1296 126/500 [======>.......................] - ETA: 1:34 - loss: 1.1298 - regression_loss: 1.0001 - classification_loss: 0.1297 127/500 [======>.......................] - ETA: 1:33 - loss: 1.1287 - regression_loss: 0.9992 - classification_loss: 0.1295 128/500 [======>.......................] - ETA: 1:33 - loss: 1.1291 - regression_loss: 0.9995 - classification_loss: 0.1296 129/500 [======>.......................] - ETA: 1:33 - loss: 1.1279 - regression_loss: 0.9983 - classification_loss: 0.1295 130/500 [======>.......................] - ETA: 1:33 - loss: 1.1236 - regression_loss: 0.9947 - classification_loss: 0.1289 131/500 [======>.......................] - ETA: 1:32 - loss: 1.1243 - regression_loss: 0.9954 - classification_loss: 0.1290 132/500 [======>.......................] 
- ETA: 1:32 - loss: 1.1271 - regression_loss: 0.9982 - classification_loss: 0.1288 133/500 [======>.......................] - ETA: 1:32 - loss: 1.1267 - regression_loss: 0.9978 - classification_loss: 0.1289 134/500 [=======>......................] - ETA: 1:32 - loss: 1.1269 - regression_loss: 0.9981 - classification_loss: 0.1288 135/500 [=======>......................] - ETA: 1:31 - loss: 1.1268 - regression_loss: 0.9974 - classification_loss: 0.1293 136/500 [=======>......................] - ETA: 1:31 - loss: 1.1252 - regression_loss: 0.9963 - classification_loss: 0.1289 137/500 [=======>......................] - ETA: 1:31 - loss: 1.1222 - regression_loss: 0.9940 - classification_loss: 0.1283 138/500 [=======>......................] - ETA: 1:31 - loss: 1.1202 - regression_loss: 0.9925 - classification_loss: 0.1277 139/500 [=======>......................] - ETA: 1:31 - loss: 1.1196 - regression_loss: 0.9919 - classification_loss: 0.1278 140/500 [=======>......................] - ETA: 1:30 - loss: 1.1202 - regression_loss: 0.9925 - classification_loss: 0.1277 141/500 [=======>......................] - ETA: 1:30 - loss: 1.1198 - regression_loss: 0.9925 - classification_loss: 0.1273 142/500 [=======>......................] - ETA: 1:30 - loss: 1.1184 - regression_loss: 0.9912 - classification_loss: 0.1272 143/500 [=======>......................] - ETA: 1:30 - loss: 1.1188 - regression_loss: 0.9914 - classification_loss: 0.1274 144/500 [=======>......................] - ETA: 1:29 - loss: 1.1172 - regression_loss: 0.9903 - classification_loss: 0.1270 145/500 [=======>......................] - ETA: 1:29 - loss: 1.1205 - regression_loss: 0.9926 - classification_loss: 0.1279 146/500 [=======>......................] - ETA: 1:29 - loss: 1.1230 - regression_loss: 0.9943 - classification_loss: 0.1287 147/500 [=======>......................] - ETA: 1:29 - loss: 1.1234 - regression_loss: 0.9944 - classification_loss: 0.1289 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.1244 - regression_loss: 0.9953 - classification_loss: 0.1291 149/500 [=======>......................] - ETA: 1:28 - loss: 1.1256 - regression_loss: 0.9962 - classification_loss: 0.1294 150/500 [========>.....................] - ETA: 1:28 - loss: 1.1282 - regression_loss: 0.9983 - classification_loss: 0.1299 151/500 [========>.....................] - ETA: 1:27 - loss: 1.1342 - regression_loss: 1.0031 - classification_loss: 0.1312 152/500 [========>.....................] - ETA: 1:27 - loss: 1.1306 - regression_loss: 1.0000 - classification_loss: 0.1306 153/500 [========>.....................] - ETA: 1:27 - loss: 1.1321 - regression_loss: 1.0012 - classification_loss: 0.1309 154/500 [========>.....................] - ETA: 1:27 - loss: 1.1330 - regression_loss: 1.0021 - classification_loss: 0.1308 155/500 [========>.....................] - ETA: 1:26 - loss: 1.1354 - regression_loss: 1.0042 - classification_loss: 0.1312 156/500 [========>.....................] - ETA: 1:26 - loss: 1.1372 - regression_loss: 1.0054 - classification_loss: 0.1318 157/500 [========>.....................] - ETA: 1:26 - loss: 1.1396 - regression_loss: 1.0072 - classification_loss: 0.1323 158/500 [========>.....................] - ETA: 1:26 - loss: 1.1349 - regression_loss: 1.0032 - classification_loss: 0.1317 159/500 [========>.....................] - ETA: 1:25 - loss: 1.1346 - regression_loss: 1.0030 - classification_loss: 0.1317 160/500 [========>.....................] - ETA: 1:25 - loss: 1.1348 - regression_loss: 1.0032 - classification_loss: 0.1316 161/500 [========>.....................] - ETA: 1:25 - loss: 1.1329 - regression_loss: 1.0010 - classification_loss: 0.1319 162/500 [========>.....................] - ETA: 1:25 - loss: 1.1373 - regression_loss: 1.0047 - classification_loss: 0.1326 163/500 [========>.....................] - ETA: 1:24 - loss: 1.1402 - regression_loss: 1.0069 - classification_loss: 0.1333 164/500 [========>.....................] 
- ETA: 1:24 - loss: 1.1365 - regression_loss: 1.0037 - classification_loss: 0.1327 165/500 [========>.....................] - ETA: 1:24 - loss: 1.1351 - regression_loss: 1.0027 - classification_loss: 0.1323 166/500 [========>.....................] - ETA: 1:24 - loss: 1.1357 - regression_loss: 1.0033 - classification_loss: 0.1324 167/500 [=========>....................] - ETA: 1:23 - loss: 1.1384 - regression_loss: 1.0056 - classification_loss: 0.1328 168/500 [=========>....................] - ETA: 1:23 - loss: 1.1364 - regression_loss: 1.0041 - classification_loss: 0.1323 169/500 [=========>....................] - ETA: 1:23 - loss: 1.1339 - regression_loss: 1.0021 - classification_loss: 0.1318 170/500 [=========>....................] - ETA: 1:23 - loss: 1.1357 - regression_loss: 1.0037 - classification_loss: 0.1320 171/500 [=========>....................] - ETA: 1:22 - loss: 1.1341 - regression_loss: 1.0023 - classification_loss: 0.1317 172/500 [=========>....................] - ETA: 1:22 - loss: 1.1327 - regression_loss: 1.0011 - classification_loss: 0.1316 173/500 [=========>....................] - ETA: 1:22 - loss: 1.1330 - regression_loss: 1.0015 - classification_loss: 0.1315 174/500 [=========>....................] - ETA: 1:22 - loss: 1.1347 - regression_loss: 1.0029 - classification_loss: 0.1317 175/500 [=========>....................] - ETA: 1:21 - loss: 1.1330 - regression_loss: 1.0015 - classification_loss: 0.1314 176/500 [=========>....................] - ETA: 1:21 - loss: 1.1345 - regression_loss: 1.0029 - classification_loss: 0.1316 177/500 [=========>....................] - ETA: 1:21 - loss: 1.1375 - regression_loss: 1.0056 - classification_loss: 0.1319 178/500 [=========>....................] - ETA: 1:21 - loss: 1.1381 - regression_loss: 1.0062 - classification_loss: 0.1318 179/500 [=========>....................] - ETA: 1:20 - loss: 1.1401 - regression_loss: 1.0079 - classification_loss: 0.1322 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.1406 - regression_loss: 1.0081 - classification_loss: 0.1324 181/500 [=========>....................] - ETA: 1:20 - loss: 1.1436 - regression_loss: 1.0101 - classification_loss: 0.1335 182/500 [=========>....................] - ETA: 1:20 - loss: 1.1446 - regression_loss: 1.0110 - classification_loss: 0.1336 183/500 [=========>....................] - ETA: 1:19 - loss: 1.1463 - regression_loss: 1.0123 - classification_loss: 0.1339 184/500 [==========>...................] - ETA: 1:19 - loss: 1.1434 - regression_loss: 1.0099 - classification_loss: 0.1335 185/500 [==========>...................] - ETA: 1:19 - loss: 1.1408 - regression_loss: 1.0074 - classification_loss: 0.1333 186/500 [==========>...................] - ETA: 1:19 - loss: 1.1382 - regression_loss: 1.0053 - classification_loss: 0.1329 187/500 [==========>...................] - ETA: 1:18 - loss: 1.1416 - regression_loss: 1.0081 - classification_loss: 0.1335 188/500 [==========>...................] - ETA: 1:18 - loss: 1.1440 - regression_loss: 1.0101 - classification_loss: 0.1339 189/500 [==========>...................] - ETA: 1:18 - loss: 1.1441 - regression_loss: 1.0100 - classification_loss: 0.1341 190/500 [==========>...................] - ETA: 1:18 - loss: 1.1458 - regression_loss: 1.0114 - classification_loss: 0.1344 191/500 [==========>...................] - ETA: 1:17 - loss: 1.1470 - regression_loss: 1.0121 - classification_loss: 0.1349 192/500 [==========>...................] - ETA: 1:17 - loss: 1.1492 - regression_loss: 1.0141 - classification_loss: 0.1352 193/500 [==========>...................] - ETA: 1:17 - loss: 1.1484 - regression_loss: 1.0133 - classification_loss: 0.1352 194/500 [==========>...................] - ETA: 1:17 - loss: 1.1501 - regression_loss: 1.0147 - classification_loss: 0.1354 195/500 [==========>...................] - ETA: 1:16 - loss: 1.1502 - regression_loss: 1.0149 - classification_loss: 0.1353 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.1520 - regression_loss: 1.0166 - classification_loss: 0.1355 197/500 [==========>...................] - ETA: 1:16 - loss: 1.1503 - regression_loss: 1.0150 - classification_loss: 0.1353 198/500 [==========>...................] - ETA: 1:16 - loss: 1.1498 - regression_loss: 1.0146 - classification_loss: 0.1352 199/500 [==========>...................] - ETA: 1:15 - loss: 1.1496 - regression_loss: 1.0142 - classification_loss: 0.1354 200/500 [===========>..................] - ETA: 1:15 - loss: 1.1512 - regression_loss: 1.0157 - classification_loss: 0.1354 201/500 [===========>..................] - ETA: 1:15 - loss: 1.1503 - regression_loss: 1.0148 - classification_loss: 0.1356 202/500 [===========>..................] - ETA: 1:15 - loss: 1.1468 - regression_loss: 1.0115 - classification_loss: 0.1352 203/500 [===========>..................] - ETA: 1:14 - loss: 1.1453 - regression_loss: 1.0101 - classification_loss: 0.1352 204/500 [===========>..................] - ETA: 1:14 - loss: 1.1455 - regression_loss: 1.0101 - classification_loss: 0.1353 205/500 [===========>..................] - ETA: 1:14 - loss: 1.1475 - regression_loss: 1.0118 - classification_loss: 0.1357 206/500 [===========>..................] - ETA: 1:14 - loss: 1.1445 - regression_loss: 1.0092 - classification_loss: 0.1352 207/500 [===========>..................] - ETA: 1:13 - loss: 1.1448 - regression_loss: 1.0094 - classification_loss: 0.1354 208/500 [===========>..................] - ETA: 1:13 - loss: 1.1454 - regression_loss: 1.0098 - classification_loss: 0.1356 209/500 [===========>..................] - ETA: 1:13 - loss: 1.1420 - regression_loss: 1.0068 - classification_loss: 0.1352 210/500 [===========>..................] - ETA: 1:13 - loss: 1.1415 - regression_loss: 1.0064 - classification_loss: 0.1351 211/500 [===========>..................] - ETA: 1:12 - loss: 1.1387 - regression_loss: 1.0039 - classification_loss: 0.1348 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.1395 - regression_loss: 1.0047 - classification_loss: 0.1348 213/500 [===========>..................] - ETA: 1:12 - loss: 1.1415 - regression_loss: 1.0065 - classification_loss: 0.1350 214/500 [===========>..................] - ETA: 1:12 - loss: 1.1409 - regression_loss: 1.0058 - classification_loss: 0.1351 215/500 [===========>..................] - ETA: 1:11 - loss: 1.1409 - regression_loss: 1.0058 - classification_loss: 0.1351 216/500 [===========>..................] - ETA: 1:11 - loss: 1.1406 - regression_loss: 1.0056 - classification_loss: 0.1350 217/500 [============>.................] - ETA: 1:11 - loss: 1.1407 - regression_loss: 1.0057 - classification_loss: 0.1350 218/500 [============>.................] - ETA: 1:11 - loss: 1.1419 - regression_loss: 1.0066 - classification_loss: 0.1353 219/500 [============>.................] - ETA: 1:10 - loss: 1.1431 - regression_loss: 1.0076 - classification_loss: 0.1355 220/500 [============>.................] - ETA: 1:10 - loss: 1.1450 - regression_loss: 1.0092 - classification_loss: 0.1357 221/500 [============>.................] - ETA: 1:10 - loss: 1.1461 - regression_loss: 1.0103 - classification_loss: 0.1358 222/500 [============>.................] - ETA: 1:10 - loss: 1.1453 - regression_loss: 1.0097 - classification_loss: 0.1356 223/500 [============>.................] - ETA: 1:09 - loss: 1.1483 - regression_loss: 1.0122 - classification_loss: 0.1361 224/500 [============>.................] - ETA: 1:09 - loss: 1.1484 - regression_loss: 1.0122 - classification_loss: 0.1362 225/500 [============>.................] - ETA: 1:09 - loss: 1.1485 - regression_loss: 1.0123 - classification_loss: 0.1362 226/500 [============>.................] - ETA: 1:09 - loss: 1.1491 - regression_loss: 1.0129 - classification_loss: 0.1362 227/500 [============>.................] - ETA: 1:08 - loss: 1.1491 - regression_loss: 1.0130 - classification_loss: 0.1362 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.1458 - regression_loss: 1.0101 - classification_loss: 0.1357 229/500 [============>.................] - ETA: 1:08 - loss: 1.1425 - regression_loss: 1.0071 - classification_loss: 0.1354 230/500 [============>.................] - ETA: 1:08 - loss: 1.1402 - regression_loss: 1.0052 - classification_loss: 0.1350 231/500 [============>.................] - ETA: 1:07 - loss: 1.1427 - regression_loss: 1.0072 - classification_loss: 0.1355 232/500 [============>.................] - ETA: 1:07 - loss: 1.1416 - regression_loss: 1.0063 - classification_loss: 0.1353 233/500 [============>.................] - ETA: 1:07 - loss: 1.1416 - regression_loss: 1.0063 - classification_loss: 0.1353 234/500 [=============>................] - ETA: 1:07 - loss: 1.1447 - regression_loss: 1.0084 - classification_loss: 0.1363 235/500 [=============>................] - ETA: 1:06 - loss: 1.1456 - regression_loss: 1.0092 - classification_loss: 0.1364 236/500 [=============>................] - ETA: 1:06 - loss: 1.1464 - regression_loss: 1.0099 - classification_loss: 0.1364 237/500 [=============>................] - ETA: 1:06 - loss: 1.1475 - regression_loss: 1.0109 - classification_loss: 0.1365 238/500 [=============>................] - ETA: 1:06 - loss: 1.1484 - regression_loss: 1.0118 - classification_loss: 0.1366 239/500 [=============>................] - ETA: 1:05 - loss: 1.1485 - regression_loss: 1.0119 - classification_loss: 0.1366 240/500 [=============>................] - ETA: 1:05 - loss: 1.1485 - regression_loss: 1.0118 - classification_loss: 0.1366 241/500 [=============>................] - ETA: 1:05 - loss: 1.1468 - regression_loss: 1.0101 - classification_loss: 0.1367 242/500 [=============>................] - ETA: 1:05 - loss: 1.1463 - regression_loss: 1.0097 - classification_loss: 0.1365 243/500 [=============>................] - ETA: 1:04 - loss: 1.1478 - regression_loss: 1.0110 - classification_loss: 0.1368 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.1497 - regression_loss: 1.0129 - classification_loss: 0.1368 245/500 [=============>................] - ETA: 1:04 - loss: 1.1490 - regression_loss: 1.0126 - classification_loss: 0.1365 246/500 [=============>................] - ETA: 1:04 - loss: 1.1488 - regression_loss: 1.0125 - classification_loss: 0.1364 247/500 [=============>................] - ETA: 1:03 - loss: 1.1513 - regression_loss: 1.0145 - classification_loss: 0.1368 248/500 [=============>................] - ETA: 1:03 - loss: 1.1510 - regression_loss: 1.0142 - classification_loss: 0.1369 249/500 [=============>................] - ETA: 1:03 - loss: 1.1493 - regression_loss: 1.0127 - classification_loss: 0.1366 250/500 [==============>...............] - ETA: 1:03 - loss: 1.1473 - regression_loss: 1.0109 - classification_loss: 0.1364 251/500 [==============>...............] - ETA: 1:02 - loss: 1.1446 - regression_loss: 1.0087 - classification_loss: 0.1360 252/500 [==============>...............] - ETA: 1:02 - loss: 1.1468 - regression_loss: 1.0103 - classification_loss: 0.1365 253/500 [==============>...............] - ETA: 1:02 - loss: 1.1437 - regression_loss: 1.0075 - classification_loss: 0.1361 254/500 [==============>...............] - ETA: 1:02 - loss: 1.1441 - regression_loss: 1.0080 - classification_loss: 0.1361 255/500 [==============>...............] - ETA: 1:01 - loss: 1.1437 - regression_loss: 1.0076 - classification_loss: 0.1361 256/500 [==============>...............] - ETA: 1:01 - loss: 1.1427 - regression_loss: 1.0068 - classification_loss: 0.1359 257/500 [==============>...............] - ETA: 1:01 - loss: 1.1412 - regression_loss: 1.0056 - classification_loss: 0.1357 258/500 [==============>...............] - ETA: 1:01 - loss: 1.1417 - regression_loss: 1.0059 - classification_loss: 0.1358 259/500 [==============>...............] - ETA: 1:00 - loss: 1.1425 - regression_loss: 1.0064 - classification_loss: 0.1361 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.1431 - regression_loss: 1.0071 - classification_loss: 0.1361 261/500 [==============>...............] - ETA: 1:00 - loss: 1.1439 - regression_loss: 1.0078 - classification_loss: 0.1361 262/500 [==============>...............] - ETA: 1:00 - loss: 1.1433 - regression_loss: 1.0074 - classification_loss: 0.1358 263/500 [==============>...............] - ETA: 59s - loss: 1.1429 - regression_loss: 1.0073 - classification_loss: 0.1356  264/500 [==============>...............] - ETA: 59s - loss: 1.1418 - regression_loss: 1.0063 - classification_loss: 0.1355 265/500 [==============>...............] - ETA: 59s - loss: 1.1396 - regression_loss: 1.0042 - classification_loss: 0.1354 266/500 [==============>...............] - ETA: 59s - loss: 1.1400 - regression_loss: 1.0045 - classification_loss: 0.1355 267/500 [===============>..............] - ETA: 58s - loss: 1.1390 - regression_loss: 1.0035 - classification_loss: 0.1354 268/500 [===============>..............] - ETA: 58s - loss: 1.1402 - regression_loss: 1.0045 - classification_loss: 0.1357 269/500 [===============>..............] - ETA: 58s - loss: 1.1378 - regression_loss: 1.0024 - classification_loss: 0.1354 270/500 [===============>..............] - ETA: 58s - loss: 1.1398 - regression_loss: 1.0041 - classification_loss: 0.1357 271/500 [===============>..............] - ETA: 57s - loss: 1.1406 - regression_loss: 1.0050 - classification_loss: 0.1356 272/500 [===============>..............] - ETA: 57s - loss: 1.1415 - regression_loss: 1.0058 - classification_loss: 0.1357 273/500 [===============>..............] - ETA: 57s - loss: 1.1410 - regression_loss: 1.0056 - classification_loss: 0.1354 274/500 [===============>..............] - ETA: 57s - loss: 1.1439 - regression_loss: 1.0084 - classification_loss: 0.1355 275/500 [===============>..............] - ETA: 56s - loss: 1.1442 - regression_loss: 1.0087 - classification_loss: 0.1355 276/500 [===============>..............] 
[Epoch 28, steps 277-499/500: per-batch progress output trimmed; ETA counted down from 56s to 0s while total loss hovered between ~1.139 and ~1.152 (regression_loss ~1.00-1.02, classification_loss ~0.135-0.137).]
500/500 [==============================] - 127s 253ms/step - loss: 1.1476 - regression_loss: 1.0104 - classification_loss: 0.1372
1172 instances of class plum with average precision: 0.7237
mAP: 0.7237
Epoch 00028: saving model to ./training/snapshots/resnet50_pascal_28.h5
Epoch 29/150
[Epoch 29, steps 1-13/500: per-batch progress output trimmed; loss fell from 1.5470 at step 1 to ~1.10 (regression_loss ~0.99, classification_loss ~0.11).]
[Epoch 29, steps 14-110/500: per-batch progress output trimmed; ETA fell from 2:04 to 1:39 while total loss fluctuated between ~1.10 and ~1.16 (regression_loss ~0.98-1.03, classification_loss ~0.12-0.14).]
- ETA: 1:38 - loss: 1.1278 - regression_loss: 0.9947 - classification_loss: 0.1330 111/500 [=====>........................] - ETA: 1:38 - loss: 1.1266 - regression_loss: 0.9935 - classification_loss: 0.1330 112/500 [=====>........................] - ETA: 1:38 - loss: 1.1280 - regression_loss: 0.9951 - classification_loss: 0.1329 113/500 [=====>........................] - ETA: 1:38 - loss: 1.1315 - regression_loss: 0.9984 - classification_loss: 0.1331 114/500 [=====>........................] - ETA: 1:37 - loss: 1.1292 - regression_loss: 0.9964 - classification_loss: 0.1328 115/500 [=====>........................] - ETA: 1:37 - loss: 1.1304 - regression_loss: 0.9972 - classification_loss: 0.1332 116/500 [=====>........................] - ETA: 1:37 - loss: 1.1313 - regression_loss: 0.9974 - classification_loss: 0.1339 117/500 [======>.......................] - ETA: 1:37 - loss: 1.1309 - regression_loss: 0.9970 - classification_loss: 0.1339 118/500 [======>.......................] - ETA: 1:36 - loss: 1.1296 - regression_loss: 0.9958 - classification_loss: 0.1337 119/500 [======>.......................] - ETA: 1:36 - loss: 1.1263 - regression_loss: 0.9930 - classification_loss: 0.1333 120/500 [======>.......................] - ETA: 1:36 - loss: 1.1283 - regression_loss: 0.9944 - classification_loss: 0.1339 121/500 [======>.......................] - ETA: 1:36 - loss: 1.1280 - regression_loss: 0.9943 - classification_loss: 0.1337 122/500 [======>.......................] - ETA: 1:35 - loss: 1.1280 - regression_loss: 0.9943 - classification_loss: 0.1337 123/500 [======>.......................] - ETA: 1:35 - loss: 1.1223 - regression_loss: 0.9894 - classification_loss: 0.1329 124/500 [======>.......................] - ETA: 1:35 - loss: 1.1230 - regression_loss: 0.9898 - classification_loss: 0.1332 125/500 [======>.......................] - ETA: 1:35 - loss: 1.1267 - regression_loss: 0.9933 - classification_loss: 0.1334 126/500 [======>.......................] 
- ETA: 1:34 - loss: 1.1261 - regression_loss: 0.9929 - classification_loss: 0.1332 127/500 [======>.......................] - ETA: 1:34 - loss: 1.1237 - regression_loss: 0.9908 - classification_loss: 0.1329 128/500 [======>.......................] - ETA: 1:34 - loss: 1.1211 - regression_loss: 0.9886 - classification_loss: 0.1324 129/500 [======>.......................] - ETA: 1:34 - loss: 1.1208 - regression_loss: 0.9888 - classification_loss: 0.1321 130/500 [======>.......................] - ETA: 1:33 - loss: 1.1251 - regression_loss: 0.9921 - classification_loss: 0.1330 131/500 [======>.......................] - ETA: 1:33 - loss: 1.1272 - regression_loss: 0.9941 - classification_loss: 0.1331 132/500 [======>.......................] - ETA: 1:33 - loss: 1.1271 - regression_loss: 0.9943 - classification_loss: 0.1328 133/500 [======>.......................] - ETA: 1:33 - loss: 1.1298 - regression_loss: 0.9966 - classification_loss: 0.1331 134/500 [=======>......................] - ETA: 1:32 - loss: 1.1299 - regression_loss: 0.9966 - classification_loss: 0.1333 135/500 [=======>......................] - ETA: 1:32 - loss: 1.1316 - regression_loss: 0.9981 - classification_loss: 0.1335 136/500 [=======>......................] - ETA: 1:32 - loss: 1.1335 - regression_loss: 0.9997 - classification_loss: 0.1338 137/500 [=======>......................] - ETA: 1:32 - loss: 1.1345 - regression_loss: 1.0007 - classification_loss: 0.1338 138/500 [=======>......................] - ETA: 1:31 - loss: 1.1342 - regression_loss: 1.0000 - classification_loss: 0.1342 139/500 [=======>......................] - ETA: 1:31 - loss: 1.1350 - regression_loss: 1.0006 - classification_loss: 0.1345 140/500 [=======>......................] - ETA: 1:31 - loss: 1.1331 - regression_loss: 0.9981 - classification_loss: 0.1350 141/500 [=======>......................] - ETA: 1:31 - loss: 1.1323 - regression_loss: 0.9974 - classification_loss: 0.1349 142/500 [=======>......................] 
- ETA: 1:30 - loss: 1.1320 - regression_loss: 0.9973 - classification_loss: 0.1346 143/500 [=======>......................] - ETA: 1:30 - loss: 1.1332 - regression_loss: 0.9987 - classification_loss: 0.1344 144/500 [=======>......................] - ETA: 1:30 - loss: 1.1340 - regression_loss: 0.9995 - classification_loss: 0.1346 145/500 [=======>......................] - ETA: 1:29 - loss: 1.1336 - regression_loss: 0.9993 - classification_loss: 0.1343 146/500 [=======>......................] - ETA: 1:29 - loss: 1.1345 - regression_loss: 0.9999 - classification_loss: 0.1346 147/500 [=======>......................] - ETA: 1:29 - loss: 1.1305 - regression_loss: 0.9965 - classification_loss: 0.1340 148/500 [=======>......................] - ETA: 1:29 - loss: 1.1314 - regression_loss: 0.9973 - classification_loss: 0.1342 149/500 [=======>......................] - ETA: 1:28 - loss: 1.1285 - regression_loss: 0.9947 - classification_loss: 0.1338 150/500 [========>.....................] - ETA: 1:28 - loss: 1.1278 - regression_loss: 0.9940 - classification_loss: 0.1338 151/500 [========>.....................] - ETA: 1:28 - loss: 1.1260 - regression_loss: 0.9925 - classification_loss: 0.1335 152/500 [========>.....................] - ETA: 1:28 - loss: 1.1270 - regression_loss: 0.9939 - classification_loss: 0.1331 153/500 [========>.....................] - ETA: 1:27 - loss: 1.1285 - regression_loss: 0.9953 - classification_loss: 0.1332 154/500 [========>.....................] - ETA: 1:27 - loss: 1.1296 - regression_loss: 0.9963 - classification_loss: 0.1334 155/500 [========>.....................] - ETA: 1:27 - loss: 1.1342 - regression_loss: 1.0004 - classification_loss: 0.1338 156/500 [========>.....................] - ETA: 1:27 - loss: 1.1309 - regression_loss: 0.9977 - classification_loss: 0.1332 157/500 [========>.....................] - ETA: 1:26 - loss: 1.1334 - regression_loss: 0.9995 - classification_loss: 0.1339 158/500 [========>.....................] 
- ETA: 1:26 - loss: 1.1351 - regression_loss: 1.0008 - classification_loss: 0.1343 159/500 [========>.....................] - ETA: 1:26 - loss: 1.1367 - regression_loss: 1.0019 - classification_loss: 0.1348 160/500 [========>.....................] - ETA: 1:26 - loss: 1.1355 - regression_loss: 1.0011 - classification_loss: 0.1344 161/500 [========>.....................] - ETA: 1:25 - loss: 1.1373 - regression_loss: 1.0025 - classification_loss: 0.1348 162/500 [========>.....................] - ETA: 1:25 - loss: 1.1394 - regression_loss: 1.0043 - classification_loss: 0.1350 163/500 [========>.....................] - ETA: 1:25 - loss: 1.1394 - regression_loss: 1.0044 - classification_loss: 0.1350 164/500 [========>.....................] - ETA: 1:25 - loss: 1.1397 - regression_loss: 1.0047 - classification_loss: 0.1349 165/500 [========>.....................] - ETA: 1:24 - loss: 1.1414 - regression_loss: 1.0065 - classification_loss: 0.1349 166/500 [========>.....................] - ETA: 1:24 - loss: 1.1402 - regression_loss: 1.0058 - classification_loss: 0.1345 167/500 [=========>....................] - ETA: 1:24 - loss: 1.1400 - regression_loss: 1.0053 - classification_loss: 0.1346 168/500 [=========>....................] - ETA: 1:24 - loss: 1.1353 - regression_loss: 1.0011 - classification_loss: 0.1341 169/500 [=========>....................] - ETA: 1:23 - loss: 1.1353 - regression_loss: 1.0012 - classification_loss: 0.1341 170/500 [=========>....................] - ETA: 1:23 - loss: 1.1376 - regression_loss: 1.0037 - classification_loss: 0.1339 171/500 [=========>....................] - ETA: 1:23 - loss: 1.1381 - regression_loss: 1.0042 - classification_loss: 0.1339 172/500 [=========>....................] - ETA: 1:23 - loss: 1.1392 - regression_loss: 1.0054 - classification_loss: 0.1338 173/500 [=========>....................] - ETA: 1:22 - loss: 1.1405 - regression_loss: 1.0069 - classification_loss: 0.1336 174/500 [=========>....................] 
- ETA: 1:22 - loss: 1.1384 - regression_loss: 1.0052 - classification_loss: 0.1333 175/500 [=========>....................] - ETA: 1:22 - loss: 1.1367 - regression_loss: 1.0036 - classification_loss: 0.1331 176/500 [=========>....................] - ETA: 1:22 - loss: 1.1383 - regression_loss: 1.0050 - classification_loss: 0.1334 177/500 [=========>....................] - ETA: 1:21 - loss: 1.1409 - regression_loss: 1.0071 - classification_loss: 0.1338 178/500 [=========>....................] - ETA: 1:21 - loss: 1.1412 - regression_loss: 1.0074 - classification_loss: 0.1338 179/500 [=========>....................] - ETA: 1:21 - loss: 1.1400 - regression_loss: 1.0063 - classification_loss: 0.1337 180/500 [=========>....................] - ETA: 1:20 - loss: 1.1407 - regression_loss: 1.0068 - classification_loss: 0.1339 181/500 [=========>....................] - ETA: 1:20 - loss: 1.1430 - regression_loss: 1.0089 - classification_loss: 0.1341 182/500 [=========>....................] - ETA: 1:20 - loss: 1.1420 - regression_loss: 1.0083 - classification_loss: 0.1337 183/500 [=========>....................] - ETA: 1:20 - loss: 1.1455 - regression_loss: 1.0113 - classification_loss: 0.1343 184/500 [==========>...................] - ETA: 1:19 - loss: 1.1456 - regression_loss: 1.0116 - classification_loss: 0.1340 185/500 [==========>...................] - ETA: 1:19 - loss: 1.1470 - regression_loss: 1.0129 - classification_loss: 0.1341 186/500 [==========>...................] - ETA: 1:19 - loss: 1.1452 - regression_loss: 1.0113 - classification_loss: 0.1339 187/500 [==========>...................] - ETA: 1:19 - loss: 1.1435 - regression_loss: 1.0099 - classification_loss: 0.1336 188/500 [==========>...................] - ETA: 1:18 - loss: 1.1434 - regression_loss: 1.0097 - classification_loss: 0.1337 189/500 [==========>...................] - ETA: 1:18 - loss: 1.1422 - regression_loss: 1.0088 - classification_loss: 0.1334 190/500 [==========>...................] 
- ETA: 1:18 - loss: 1.1415 - regression_loss: 1.0081 - classification_loss: 0.1334 191/500 [==========>...................] - ETA: 1:18 - loss: 1.1387 - regression_loss: 1.0057 - classification_loss: 0.1330 192/500 [==========>...................] - ETA: 1:17 - loss: 1.1417 - regression_loss: 1.0077 - classification_loss: 0.1341 193/500 [==========>...................] - ETA: 1:17 - loss: 1.1407 - regression_loss: 1.0070 - classification_loss: 0.1337 194/500 [==========>...................] - ETA: 1:17 - loss: 1.1408 - regression_loss: 1.0066 - classification_loss: 0.1342 195/500 [==========>...................] - ETA: 1:17 - loss: 1.1413 - regression_loss: 1.0070 - classification_loss: 0.1343 196/500 [==========>...................] - ETA: 1:16 - loss: 1.1402 - regression_loss: 1.0058 - classification_loss: 0.1344 197/500 [==========>...................] - ETA: 1:16 - loss: 1.1370 - regression_loss: 1.0032 - classification_loss: 0.1339 198/500 [==========>...................] - ETA: 1:16 - loss: 1.1345 - regression_loss: 1.0009 - classification_loss: 0.1336 199/500 [==========>...................] - ETA: 1:16 - loss: 1.1344 - regression_loss: 1.0008 - classification_loss: 0.1336 200/500 [===========>..................] - ETA: 1:15 - loss: 1.1352 - regression_loss: 1.0016 - classification_loss: 0.1336 201/500 [===========>..................] - ETA: 1:15 - loss: 1.1338 - regression_loss: 1.0003 - classification_loss: 0.1335 202/500 [===========>..................] - ETA: 1:15 - loss: 1.1309 - regression_loss: 0.9978 - classification_loss: 0.1330 203/500 [===========>..................] - ETA: 1:15 - loss: 1.1315 - regression_loss: 0.9984 - classification_loss: 0.1331 204/500 [===========>..................] - ETA: 1:14 - loss: 1.1320 - regression_loss: 0.9987 - classification_loss: 0.1333 205/500 [===========>..................] - ETA: 1:14 - loss: 1.1316 - regression_loss: 0.9982 - classification_loss: 0.1333 206/500 [===========>..................] 
- ETA: 1:14 - loss: 1.1329 - regression_loss: 0.9993 - classification_loss: 0.1336 207/500 [===========>..................] - ETA: 1:14 - loss: 1.1332 - regression_loss: 0.9999 - classification_loss: 0.1334 208/500 [===========>..................] - ETA: 1:13 - loss: 1.1296 - regression_loss: 0.9967 - classification_loss: 0.1329 209/500 [===========>..................] - ETA: 1:13 - loss: 1.1321 - regression_loss: 0.9986 - classification_loss: 0.1335 210/500 [===========>..................] - ETA: 1:13 - loss: 1.1328 - regression_loss: 0.9990 - classification_loss: 0.1339 211/500 [===========>..................] - ETA: 1:13 - loss: 1.1340 - regression_loss: 1.0000 - classification_loss: 0.1340 212/500 [===========>..................] - ETA: 1:12 - loss: 1.1317 - regression_loss: 0.9981 - classification_loss: 0.1336 213/500 [===========>..................] - ETA: 1:12 - loss: 1.1323 - regression_loss: 0.9988 - classification_loss: 0.1335 214/500 [===========>..................] - ETA: 1:12 - loss: 1.1313 - regression_loss: 0.9980 - classification_loss: 0.1333 215/500 [===========>..................] - ETA: 1:12 - loss: 1.1314 - regression_loss: 0.9981 - classification_loss: 0.1333 216/500 [===========>..................] - ETA: 1:11 - loss: 1.1319 - regression_loss: 0.9984 - classification_loss: 0.1335 217/500 [============>.................] - ETA: 1:11 - loss: 1.1329 - regression_loss: 0.9993 - classification_loss: 0.1336 218/500 [============>.................] - ETA: 1:11 - loss: 1.1302 - regression_loss: 0.9969 - classification_loss: 0.1332 219/500 [============>.................] - ETA: 1:11 - loss: 1.1313 - regression_loss: 0.9979 - classification_loss: 0.1333 220/500 [============>.................] - ETA: 1:10 - loss: 1.1332 - regression_loss: 0.9993 - classification_loss: 0.1338 221/500 [============>.................] - ETA: 1:10 - loss: 1.1345 - regression_loss: 1.0008 - classification_loss: 0.1338 222/500 [============>.................] 
- ETA: 1:10 - loss: 1.1337 - regression_loss: 1.0001 - classification_loss: 0.1335 223/500 [============>.................] - ETA: 1:10 - loss: 1.1334 - regression_loss: 0.9999 - classification_loss: 0.1335 224/500 [============>.................] - ETA: 1:09 - loss: 1.1339 - regression_loss: 1.0000 - classification_loss: 0.1338 225/500 [============>.................] - ETA: 1:09 - loss: 1.1350 - regression_loss: 1.0010 - classification_loss: 0.1339 226/500 [============>.................] - ETA: 1:09 - loss: 1.1331 - regression_loss: 0.9995 - classification_loss: 0.1336 227/500 [============>.................] - ETA: 1:09 - loss: 1.1339 - regression_loss: 1.0002 - classification_loss: 0.1336 228/500 [============>.................] - ETA: 1:08 - loss: 1.1365 - regression_loss: 1.0026 - classification_loss: 0.1338 229/500 [============>.................] - ETA: 1:08 - loss: 1.1361 - regression_loss: 1.0024 - classification_loss: 0.1337 230/500 [============>.................] - ETA: 1:08 - loss: 1.1370 - regression_loss: 1.0031 - classification_loss: 0.1339 231/500 [============>.................] - ETA: 1:08 - loss: 1.1338 - regression_loss: 1.0003 - classification_loss: 0.1335 232/500 [============>.................] - ETA: 1:07 - loss: 1.1343 - regression_loss: 1.0009 - classification_loss: 0.1334 233/500 [============>.................] - ETA: 1:07 - loss: 1.1352 - regression_loss: 1.0018 - classification_loss: 0.1333 234/500 [=============>................] - ETA: 1:07 - loss: 1.1369 - regression_loss: 1.0033 - classification_loss: 0.1336 235/500 [=============>................] - ETA: 1:07 - loss: 1.1357 - regression_loss: 1.0024 - classification_loss: 0.1333 236/500 [=============>................] - ETA: 1:06 - loss: 1.1357 - regression_loss: 1.0024 - classification_loss: 0.1333 237/500 [=============>................] - ETA: 1:06 - loss: 1.1374 - regression_loss: 1.0039 - classification_loss: 0.1335 238/500 [=============>................] 
- ETA: 1:06 - loss: 1.1389 - regression_loss: 1.0052 - classification_loss: 0.1337 239/500 [=============>................] - ETA: 1:06 - loss: 1.1411 - regression_loss: 1.0070 - classification_loss: 0.1341 240/500 [=============>................] - ETA: 1:05 - loss: 1.1447 - regression_loss: 1.0102 - classification_loss: 0.1345 241/500 [=============>................] - ETA: 1:05 - loss: 1.1429 - regression_loss: 1.0088 - classification_loss: 0.1341 242/500 [=============>................] - ETA: 1:05 - loss: 1.1428 - regression_loss: 1.0086 - classification_loss: 0.1341 243/500 [=============>................] - ETA: 1:05 - loss: 1.1426 - regression_loss: 1.0086 - classification_loss: 0.1340 244/500 [=============>................] - ETA: 1:04 - loss: 1.1432 - regression_loss: 1.0091 - classification_loss: 0.1340 245/500 [=============>................] - ETA: 1:04 - loss: 1.1448 - regression_loss: 1.0107 - classification_loss: 0.1341 246/500 [=============>................] - ETA: 1:04 - loss: 1.1457 - regression_loss: 1.0112 - classification_loss: 0.1344 247/500 [=============>................] - ETA: 1:04 - loss: 1.1466 - regression_loss: 1.0121 - classification_loss: 0.1345 248/500 [=============>................] - ETA: 1:03 - loss: 1.1460 - regression_loss: 1.0119 - classification_loss: 0.1341 249/500 [=============>................] - ETA: 1:03 - loss: 1.1459 - regression_loss: 1.0118 - classification_loss: 0.1340 250/500 [==============>...............] - ETA: 1:03 - loss: 1.1462 - regression_loss: 1.0120 - classification_loss: 0.1342 251/500 [==============>...............] - ETA: 1:02 - loss: 1.1469 - regression_loss: 1.0125 - classification_loss: 0.1344 252/500 [==============>...............] - ETA: 1:02 - loss: 1.1475 - regression_loss: 1.0130 - classification_loss: 0.1344 253/500 [==============>...............] - ETA: 1:02 - loss: 1.1478 - regression_loss: 1.0133 - classification_loss: 0.1345 254/500 [==============>...............] 
- ETA: 1:02 - loss: 1.1444 - regression_loss: 1.0103 - classification_loss: 0.1341 255/500 [==============>...............] - ETA: 1:01 - loss: 1.1423 - regression_loss: 1.0083 - classification_loss: 0.1339 256/500 [==============>...............] - ETA: 1:01 - loss: 1.1424 - regression_loss: 1.0085 - classification_loss: 0.1339 257/500 [==============>...............] - ETA: 1:01 - loss: 1.1422 - regression_loss: 1.0083 - classification_loss: 0.1338 258/500 [==============>...............] - ETA: 1:01 - loss: 1.1426 - regression_loss: 1.0088 - classification_loss: 0.1338 259/500 [==============>...............] - ETA: 1:00 - loss: 1.1437 - regression_loss: 1.0097 - classification_loss: 0.1340 260/500 [==============>...............] - ETA: 1:00 - loss: 1.1407 - regression_loss: 1.0071 - classification_loss: 0.1336 261/500 [==============>...............] - ETA: 1:00 - loss: 1.1402 - regression_loss: 1.0068 - classification_loss: 0.1335 262/500 [==============>...............] - ETA: 59s - loss: 1.1396 - regression_loss: 1.0064 - classification_loss: 0.1332  263/500 [==============>...............] - ETA: 59s - loss: 1.1412 - regression_loss: 1.0079 - classification_loss: 0.1333 264/500 [==============>...............] - ETA: 59s - loss: 1.1422 - regression_loss: 1.0085 - classification_loss: 0.1336 265/500 [==============>...............] - ETA: 59s - loss: 1.1417 - regression_loss: 1.0080 - classification_loss: 0.1337 266/500 [==============>...............] - ETA: 58s - loss: 1.1428 - regression_loss: 1.0089 - classification_loss: 0.1339 267/500 [===============>..............] - ETA: 58s - loss: 1.1436 - regression_loss: 1.0096 - classification_loss: 0.1341 268/500 [===============>..............] - ETA: 58s - loss: 1.1430 - regression_loss: 1.0090 - classification_loss: 0.1340 269/500 [===============>..............] - ETA: 58s - loss: 1.1439 - regression_loss: 1.0098 - classification_loss: 0.1340 270/500 [===============>..............] 
- ETA: 57s - loss: 1.1434 - regression_loss: 1.0095 - classification_loss: 0.1340 271/500 [===============>..............] - ETA: 57s - loss: 1.1433 - regression_loss: 1.0093 - classification_loss: 0.1340 272/500 [===============>..............] - ETA: 57s - loss: 1.1412 - regression_loss: 1.0075 - classification_loss: 0.1338 273/500 [===============>..............] - ETA: 57s - loss: 1.1412 - regression_loss: 1.0075 - classification_loss: 0.1337 274/500 [===============>..............] - ETA: 56s - loss: 1.1412 - regression_loss: 1.0074 - classification_loss: 0.1338 275/500 [===============>..............] - ETA: 56s - loss: 1.1404 - regression_loss: 1.0068 - classification_loss: 0.1336 276/500 [===============>..............] - ETA: 56s - loss: 1.1422 - regression_loss: 1.0082 - classification_loss: 0.1340 277/500 [===============>..............] - ETA: 56s - loss: 1.1420 - regression_loss: 1.0080 - classification_loss: 0.1339 278/500 [===============>..............] - ETA: 55s - loss: 1.1431 - regression_loss: 1.0089 - classification_loss: 0.1342 279/500 [===============>..............] - ETA: 55s - loss: 1.1422 - regression_loss: 1.0082 - classification_loss: 0.1340 280/500 [===============>..............] - ETA: 55s - loss: 1.1395 - regression_loss: 1.0058 - classification_loss: 0.1337 281/500 [===============>..............] - ETA: 55s - loss: 1.1404 - regression_loss: 1.0066 - classification_loss: 0.1339 282/500 [===============>..............] - ETA: 54s - loss: 1.1417 - regression_loss: 1.0077 - classification_loss: 0.1340 283/500 [===============>..............] - ETA: 54s - loss: 1.1408 - regression_loss: 1.0068 - classification_loss: 0.1340 284/500 [================>.............] - ETA: 54s - loss: 1.1484 - regression_loss: 1.0129 - classification_loss: 0.1356 285/500 [================>.............] - ETA: 54s - loss: 1.1469 - regression_loss: 1.0117 - classification_loss: 0.1353 286/500 [================>.............] 
- ETA: 53s - loss: 1.1480 - regression_loss: 1.0126 - classification_loss: 0.1354 287/500 [================>.............] - ETA: 53s - loss: 1.1488 - regression_loss: 1.0132 - classification_loss: 0.1356 288/500 [================>.............] - ETA: 53s - loss: 1.1508 - regression_loss: 1.0148 - classification_loss: 0.1360 289/500 [================>.............] - ETA: 53s - loss: 1.1515 - regression_loss: 1.0153 - classification_loss: 0.1362 290/500 [================>.............] - ETA: 52s - loss: 1.1494 - regression_loss: 1.0135 - classification_loss: 0.1359 291/500 [================>.............] - ETA: 52s - loss: 1.1486 - regression_loss: 1.0128 - classification_loss: 0.1357 292/500 [================>.............] - ETA: 52s - loss: 1.1530 - regression_loss: 1.0168 - classification_loss: 0.1362 293/500 [================>.............] - ETA: 52s - loss: 1.1518 - regression_loss: 1.0157 - classification_loss: 0.1361 294/500 [================>.............] - ETA: 51s - loss: 1.1508 - regression_loss: 1.0149 - classification_loss: 0.1360 295/500 [================>.............] - ETA: 51s - loss: 1.1500 - regression_loss: 1.0142 - classification_loss: 0.1358 296/500 [================>.............] - ETA: 51s - loss: 1.1500 - regression_loss: 1.0141 - classification_loss: 0.1358 297/500 [================>.............] - ETA: 51s - loss: 1.1485 - regression_loss: 1.0128 - classification_loss: 0.1357 298/500 [================>.............] - ETA: 50s - loss: 1.1498 - regression_loss: 1.0141 - classification_loss: 0.1357 299/500 [================>.............] - ETA: 50s - loss: 1.1508 - regression_loss: 1.0148 - classification_loss: 0.1360 300/500 [=================>............] - ETA: 50s - loss: 1.1507 - regression_loss: 1.0148 - classification_loss: 0.1359 301/500 [=================>............] - ETA: 50s - loss: 1.1499 - regression_loss: 1.0141 - classification_loss: 0.1358 302/500 [=================>............] 
- ETA: 49s - loss: 1.1501 - regression_loss: 1.0144 - classification_loss: 0.1357 303/500 [=================>............] - ETA: 49s - loss: 1.1490 - regression_loss: 1.0134 - classification_loss: 0.1356 304/500 [=================>............] - ETA: 49s - loss: 1.1465 - regression_loss: 1.0112 - classification_loss: 0.1353 305/500 [=================>............] - ETA: 49s - loss: 1.1459 - regression_loss: 1.0108 - classification_loss: 0.1351 306/500 [=================>............] - ETA: 48s - loss: 1.1451 - regression_loss: 1.0102 - classification_loss: 0.1349 307/500 [=================>............] - ETA: 48s - loss: 1.1435 - regression_loss: 1.0089 - classification_loss: 0.1346 308/500 [=================>............] - ETA: 48s - loss: 1.1421 - regression_loss: 1.0077 - classification_loss: 0.1343 309/500 [=================>............] - ETA: 48s - loss: 1.1432 - regression_loss: 1.0088 - classification_loss: 0.1344 310/500 [=================>............] - ETA: 47s - loss: 1.1438 - regression_loss: 1.0092 - classification_loss: 0.1346 311/500 [=================>............] - ETA: 47s - loss: 1.1444 - regression_loss: 1.0099 - classification_loss: 0.1345 312/500 [=================>............] - ETA: 47s - loss: 1.1422 - regression_loss: 1.0080 - classification_loss: 0.1342 313/500 [=================>............] - ETA: 47s - loss: 1.1413 - regression_loss: 1.0072 - classification_loss: 0.1341 314/500 [=================>............] - ETA: 46s - loss: 1.1399 - regression_loss: 1.0061 - classification_loss: 0.1339 315/500 [=================>............] - ETA: 46s - loss: 1.1397 - regression_loss: 1.0059 - classification_loss: 0.1338 316/500 [=================>............] - ETA: 46s - loss: 1.1401 - regression_loss: 1.0063 - classification_loss: 0.1338 317/500 [==================>...........] - ETA: 46s - loss: 1.1410 - regression_loss: 1.0068 - classification_loss: 0.1341 318/500 [==================>...........] 
500/500 [==============================] - 126s 252ms/step - loss: 1.1343 - regression_loss: 0.9996 - classification_loss: 0.1347
1172 instances of class plum with average precision: 0.7905
mAP: 0.7905
Epoch 00029: saving model to ./training/snapshots/resnet50_pascal_29.h5
Epoch 30/150
153/500 [========>.....................] 
- ETA: 1:27 - loss: 1.1353 - regression_loss: 1.0007 - classification_loss: 0.1346 154/500 [========>.....................] - ETA: 1:27 - loss: 1.1357 - regression_loss: 1.0008 - classification_loss: 0.1349 155/500 [========>.....................] - ETA: 1:27 - loss: 1.1352 - regression_loss: 1.0002 - classification_loss: 0.1349 156/500 [========>.....................] - ETA: 1:27 - loss: 1.1370 - regression_loss: 1.0021 - classification_loss: 0.1349 157/500 [========>.....................] - ETA: 1:26 - loss: 1.1404 - regression_loss: 1.0053 - classification_loss: 0.1351 158/500 [========>.....................] - ETA: 1:26 - loss: 1.1396 - regression_loss: 1.0044 - classification_loss: 0.1352 159/500 [========>.....................] - ETA: 1:26 - loss: 1.1407 - regression_loss: 1.0055 - classification_loss: 0.1352 160/500 [========>.....................] - ETA: 1:26 - loss: 1.1400 - regression_loss: 1.0048 - classification_loss: 0.1351 161/500 [========>.....................] - ETA: 1:25 - loss: 1.1396 - regression_loss: 1.0046 - classification_loss: 0.1350 162/500 [========>.....................] - ETA: 1:25 - loss: 1.1373 - regression_loss: 1.0027 - classification_loss: 0.1346 163/500 [========>.....................] - ETA: 1:25 - loss: 1.1339 - regression_loss: 0.9998 - classification_loss: 0.1340 164/500 [========>.....................] - ETA: 1:25 - loss: 1.1355 - regression_loss: 1.0012 - classification_loss: 0.1343 165/500 [========>.....................] - ETA: 1:24 - loss: 1.1355 - regression_loss: 1.0008 - classification_loss: 0.1348 166/500 [========>.....................] - ETA: 1:24 - loss: 1.1367 - regression_loss: 1.0017 - classification_loss: 0.1350 167/500 [=========>....................] - ETA: 1:24 - loss: 1.1340 - regression_loss: 0.9992 - classification_loss: 0.1347 168/500 [=========>....................] - ETA: 1:24 - loss: 1.1340 - regression_loss: 0.9996 - classification_loss: 0.1344 169/500 [=========>....................] 
- ETA: 1:23 - loss: 1.1310 - regression_loss: 0.9971 - classification_loss: 0.1339 170/500 [=========>....................] - ETA: 1:23 - loss: 1.1294 - regression_loss: 0.9957 - classification_loss: 0.1338 171/500 [=========>....................] - ETA: 1:23 - loss: 1.1266 - regression_loss: 0.9933 - classification_loss: 0.1333 172/500 [=========>....................] - ETA: 1:23 - loss: 1.1274 - regression_loss: 0.9941 - classification_loss: 0.1333 173/500 [=========>....................] - ETA: 1:22 - loss: 1.1235 - regression_loss: 0.9908 - classification_loss: 0.1326 174/500 [=========>....................] - ETA: 1:22 - loss: 1.1210 - regression_loss: 0.9889 - classification_loss: 0.1321 175/500 [=========>....................] - ETA: 1:22 - loss: 1.1221 - regression_loss: 0.9893 - classification_loss: 0.1328 176/500 [=========>....................] - ETA: 1:22 - loss: 1.1214 - regression_loss: 0.9888 - classification_loss: 0.1326 177/500 [=========>....................] - ETA: 1:21 - loss: 1.1213 - regression_loss: 0.9888 - classification_loss: 0.1324 178/500 [=========>....................] - ETA: 1:21 - loss: 1.1214 - regression_loss: 0.9890 - classification_loss: 0.1324 179/500 [=========>....................] - ETA: 1:21 - loss: 1.1209 - regression_loss: 0.9887 - classification_loss: 0.1322 180/500 [=========>....................] - ETA: 1:21 - loss: 1.1184 - regression_loss: 0.9867 - classification_loss: 0.1317 181/500 [=========>....................] - ETA: 1:20 - loss: 1.1168 - regression_loss: 0.9853 - classification_loss: 0.1315 182/500 [=========>....................] - ETA: 1:20 - loss: 1.1172 - regression_loss: 0.9858 - classification_loss: 0.1315 183/500 [=========>....................] - ETA: 1:20 - loss: 1.1183 - regression_loss: 0.9865 - classification_loss: 0.1317 184/500 [==========>...................] - ETA: 1:20 - loss: 1.1181 - regression_loss: 0.9862 - classification_loss: 0.1319 185/500 [==========>...................] 
- ETA: 1:19 - loss: 1.1204 - regression_loss: 0.9881 - classification_loss: 0.1323 186/500 [==========>...................] - ETA: 1:19 - loss: 1.1232 - regression_loss: 0.9902 - classification_loss: 0.1330 187/500 [==========>...................] - ETA: 1:19 - loss: 1.1238 - regression_loss: 0.9907 - classification_loss: 0.1331 188/500 [==========>...................] - ETA: 1:19 - loss: 1.1241 - regression_loss: 0.9908 - classification_loss: 0.1333 189/500 [==========>...................] - ETA: 1:18 - loss: 1.1244 - regression_loss: 0.9909 - classification_loss: 0.1335 190/500 [==========>...................] - ETA: 1:18 - loss: 1.1237 - regression_loss: 0.9904 - classification_loss: 0.1333 191/500 [==========>...................] - ETA: 1:18 - loss: 1.1241 - regression_loss: 0.9911 - classification_loss: 0.1330 192/500 [==========>...................] - ETA: 1:18 - loss: 1.1240 - regression_loss: 0.9912 - classification_loss: 0.1328 193/500 [==========>...................] - ETA: 1:17 - loss: 1.1246 - regression_loss: 0.9918 - classification_loss: 0.1327 194/500 [==========>...................] - ETA: 1:17 - loss: 1.1221 - regression_loss: 0.9895 - classification_loss: 0.1326 195/500 [==========>...................] - ETA: 1:17 - loss: 1.1219 - regression_loss: 0.9893 - classification_loss: 0.1326 196/500 [==========>...................] - ETA: 1:17 - loss: 1.1220 - regression_loss: 0.9897 - classification_loss: 0.1323 197/500 [==========>...................] - ETA: 1:16 - loss: 1.1255 - regression_loss: 0.9931 - classification_loss: 0.1324 198/500 [==========>...................] - ETA: 1:16 - loss: 1.1269 - regression_loss: 0.9944 - classification_loss: 0.1325 199/500 [==========>...................] - ETA: 1:16 - loss: 1.1286 - regression_loss: 0.9958 - classification_loss: 0.1328 200/500 [===========>..................] - ETA: 1:16 - loss: 1.1289 - regression_loss: 0.9961 - classification_loss: 0.1328 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.1268 - regression_loss: 0.9945 - classification_loss: 0.1323 202/500 [===========>..................] - ETA: 1:15 - loss: 1.1259 - regression_loss: 0.9937 - classification_loss: 0.1322 203/500 [===========>..................] - ETA: 1:15 - loss: 1.1271 - regression_loss: 0.9949 - classification_loss: 0.1322 204/500 [===========>..................] - ETA: 1:15 - loss: 1.1258 - regression_loss: 0.9935 - classification_loss: 0.1323 205/500 [===========>..................] - ETA: 1:14 - loss: 1.1280 - regression_loss: 0.9953 - classification_loss: 0.1327 206/500 [===========>..................] - ETA: 1:14 - loss: 1.1315 - regression_loss: 0.9975 - classification_loss: 0.1340 207/500 [===========>..................] - ETA: 1:14 - loss: 1.1317 - regression_loss: 0.9980 - classification_loss: 0.1337 208/500 [===========>..................] - ETA: 1:14 - loss: 1.1338 - regression_loss: 0.9998 - classification_loss: 0.1340 209/500 [===========>..................] - ETA: 1:13 - loss: 1.1322 - regression_loss: 0.9985 - classification_loss: 0.1338 210/500 [===========>..................] - ETA: 1:13 - loss: 1.1316 - regression_loss: 0.9981 - classification_loss: 0.1335 211/500 [===========>..................] - ETA: 1:13 - loss: 1.1338 - regression_loss: 0.9998 - classification_loss: 0.1340 212/500 [===========>..................] - ETA: 1:13 - loss: 1.1337 - regression_loss: 1.0001 - classification_loss: 0.1336 213/500 [===========>..................] - ETA: 1:12 - loss: 1.1302 - regression_loss: 0.9971 - classification_loss: 0.1332 214/500 [===========>..................] - ETA: 1:12 - loss: 1.1267 - regression_loss: 0.9940 - classification_loss: 0.1327 215/500 [===========>..................] - ETA: 1:12 - loss: 1.1246 - regression_loss: 0.9922 - classification_loss: 0.1324 216/500 [===========>..................] - ETA: 1:12 - loss: 1.1273 - regression_loss: 0.9942 - classification_loss: 0.1331 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.1275 - regression_loss: 0.9944 - classification_loss: 0.1331 218/500 [============>.................] - ETA: 1:11 - loss: 1.1274 - regression_loss: 0.9945 - classification_loss: 0.1330 219/500 [============>.................] - ETA: 1:11 - loss: 1.1270 - regression_loss: 0.9941 - classification_loss: 0.1329 220/500 [============>.................] - ETA: 1:11 - loss: 1.1264 - regression_loss: 0.9934 - classification_loss: 0.1330 221/500 [============>.................] - ETA: 1:10 - loss: 1.1264 - regression_loss: 0.9938 - classification_loss: 0.1327 222/500 [============>.................] - ETA: 1:10 - loss: 1.1259 - regression_loss: 0.9932 - classification_loss: 0.1327 223/500 [============>.................] - ETA: 1:10 - loss: 1.1234 - regression_loss: 0.9910 - classification_loss: 0.1324 224/500 [============>.................] - ETA: 1:10 - loss: 1.1210 - regression_loss: 0.9889 - classification_loss: 0.1321 225/500 [============>.................] - ETA: 1:09 - loss: 1.1218 - regression_loss: 0.9896 - classification_loss: 0.1322 226/500 [============>.................] - ETA: 1:09 - loss: 1.1251 - regression_loss: 0.9924 - classification_loss: 0.1327 227/500 [============>.................] - ETA: 1:09 - loss: 1.1255 - regression_loss: 0.9927 - classification_loss: 0.1328 228/500 [============>.................] - ETA: 1:09 - loss: 1.1249 - regression_loss: 0.9921 - classification_loss: 0.1327 229/500 [============>.................] - ETA: 1:08 - loss: 1.1264 - regression_loss: 0.9936 - classification_loss: 0.1329 230/500 [============>.................] - ETA: 1:08 - loss: 1.1258 - regression_loss: 0.9930 - classification_loss: 0.1327 231/500 [============>.................] - ETA: 1:08 - loss: 1.1243 - regression_loss: 0.9920 - classification_loss: 0.1323 232/500 [============>.................] - ETA: 1:08 - loss: 1.1238 - regression_loss: 0.9913 - classification_loss: 0.1325 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.1241 - regression_loss: 0.9916 - classification_loss: 0.1325 234/500 [=============>................] - ETA: 1:07 - loss: 1.1270 - regression_loss: 0.9940 - classification_loss: 0.1330 235/500 [=============>................] - ETA: 1:07 - loss: 1.1282 - regression_loss: 0.9949 - classification_loss: 0.1333 236/500 [=============>................] - ETA: 1:07 - loss: 1.1275 - regression_loss: 0.9943 - classification_loss: 0.1333 237/500 [=============>................] - ETA: 1:06 - loss: 1.1250 - regression_loss: 0.9922 - classification_loss: 0.1329 238/500 [=============>................] - ETA: 1:06 - loss: 1.1220 - regression_loss: 0.9894 - classification_loss: 0.1325 239/500 [=============>................] - ETA: 1:06 - loss: 1.1203 - regression_loss: 0.9882 - classification_loss: 0.1322 240/500 [=============>................] - ETA: 1:05 - loss: 1.1203 - regression_loss: 0.9881 - classification_loss: 0.1322 241/500 [=============>................] - ETA: 1:05 - loss: 1.1237 - regression_loss: 0.9913 - classification_loss: 0.1324 242/500 [=============>................] - ETA: 1:05 - loss: 1.1235 - regression_loss: 0.9911 - classification_loss: 0.1324 243/500 [=============>................] - ETA: 1:05 - loss: 1.1238 - regression_loss: 0.9915 - classification_loss: 0.1324 244/500 [=============>................] - ETA: 1:04 - loss: 1.1208 - regression_loss: 0.9889 - classification_loss: 0.1319 245/500 [=============>................] - ETA: 1:04 - loss: 1.1206 - regression_loss: 0.9886 - classification_loss: 0.1321 246/500 [=============>................] - ETA: 1:04 - loss: 1.1221 - regression_loss: 0.9898 - classification_loss: 0.1323 247/500 [=============>................] - ETA: 1:04 - loss: 1.1210 - regression_loss: 0.9888 - classification_loss: 0.1322 248/500 [=============>................] - ETA: 1:03 - loss: 1.1211 - regression_loss: 0.9889 - classification_loss: 0.1322 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.1213 - regression_loss: 0.9891 - classification_loss: 0.1321 250/500 [==============>...............] - ETA: 1:03 - loss: 1.1187 - regression_loss: 0.9869 - classification_loss: 0.1318 251/500 [==============>...............] - ETA: 1:03 - loss: 1.1196 - regression_loss: 0.9876 - classification_loss: 0.1320 252/500 [==============>...............] - ETA: 1:02 - loss: 1.1209 - regression_loss: 0.9886 - classification_loss: 0.1323 253/500 [==============>...............] - ETA: 1:02 - loss: 1.1201 - regression_loss: 0.9880 - classification_loss: 0.1321 254/500 [==============>...............] - ETA: 1:02 - loss: 1.1186 - regression_loss: 0.9869 - classification_loss: 0.1317 255/500 [==============>...............] - ETA: 1:02 - loss: 1.1184 - regression_loss: 0.9866 - classification_loss: 0.1318 256/500 [==============>...............] - ETA: 1:01 - loss: 1.1207 - regression_loss: 0.9886 - classification_loss: 0.1321 257/500 [==============>...............] - ETA: 1:01 - loss: 1.1223 - regression_loss: 0.9901 - classification_loss: 0.1322 258/500 [==============>...............] - ETA: 1:01 - loss: 1.1204 - regression_loss: 0.9884 - classification_loss: 0.1320 259/500 [==============>...............] - ETA: 1:01 - loss: 1.1212 - regression_loss: 0.9891 - classification_loss: 0.1321 260/500 [==============>...............] - ETA: 1:00 - loss: 1.1217 - regression_loss: 0.9896 - classification_loss: 0.1321 261/500 [==============>...............] - ETA: 1:00 - loss: 1.1236 - regression_loss: 0.9911 - classification_loss: 0.1324 262/500 [==============>...............] - ETA: 1:00 - loss: 1.1252 - regression_loss: 0.9925 - classification_loss: 0.1327 263/500 [==============>...............] - ETA: 1:00 - loss: 1.1249 - regression_loss: 0.9923 - classification_loss: 0.1327 264/500 [==============>...............] - ETA: 59s - loss: 1.1250 - regression_loss: 0.9922 - classification_loss: 0.1328  265/500 [==============>...............] 
- ETA: 59s - loss: 1.1250 - regression_loss: 0.9922 - classification_loss: 0.1328 266/500 [==============>...............] - ETA: 59s - loss: 1.1271 - regression_loss: 0.9940 - classification_loss: 0.1331 267/500 [===============>..............] - ETA: 59s - loss: 1.1290 - regression_loss: 0.9956 - classification_loss: 0.1334 268/500 [===============>..............] - ETA: 58s - loss: 1.1299 - regression_loss: 0.9964 - classification_loss: 0.1335 269/500 [===============>..............] - ETA: 58s - loss: 1.1289 - regression_loss: 0.9955 - classification_loss: 0.1334 270/500 [===============>..............] - ETA: 58s - loss: 1.1324 - regression_loss: 0.9986 - classification_loss: 0.1338 271/500 [===============>..............] - ETA: 58s - loss: 1.1331 - regression_loss: 0.9990 - classification_loss: 0.1341 272/500 [===============>..............] - ETA: 57s - loss: 1.1312 - regression_loss: 0.9973 - classification_loss: 0.1339 273/500 [===============>..............] - ETA: 57s - loss: 1.1310 - regression_loss: 0.9971 - classification_loss: 0.1339 274/500 [===============>..............] - ETA: 57s - loss: 1.1291 - regression_loss: 0.9956 - classification_loss: 0.1335 275/500 [===============>..............] - ETA: 57s - loss: 1.1293 - regression_loss: 0.9957 - classification_loss: 0.1337 276/500 [===============>..............] - ETA: 56s - loss: 1.1306 - regression_loss: 0.9968 - classification_loss: 0.1339 277/500 [===============>..............] - ETA: 56s - loss: 1.1282 - regression_loss: 0.9946 - classification_loss: 0.1336 278/500 [===============>..............] - ETA: 56s - loss: 1.1288 - regression_loss: 0.9951 - classification_loss: 0.1336 279/500 [===============>..............] - ETA: 56s - loss: 1.1288 - regression_loss: 0.9952 - classification_loss: 0.1336 280/500 [===============>..............] - ETA: 55s - loss: 1.1303 - regression_loss: 0.9966 - classification_loss: 0.1337 281/500 [===============>..............] 
- ETA: 55s - loss: 1.1307 - regression_loss: 0.9971 - classification_loss: 0.1336 282/500 [===============>..............] - ETA: 55s - loss: 1.1312 - regression_loss: 0.9975 - classification_loss: 0.1337 283/500 [===============>..............] - ETA: 55s - loss: 1.1322 - regression_loss: 0.9984 - classification_loss: 0.1338 284/500 [================>.............] - ETA: 54s - loss: 1.1315 - regression_loss: 0.9979 - classification_loss: 0.1336 285/500 [================>.............] - ETA: 54s - loss: 1.1303 - regression_loss: 0.9970 - classification_loss: 0.1333 286/500 [================>.............] - ETA: 54s - loss: 1.1323 - regression_loss: 0.9987 - classification_loss: 0.1337 287/500 [================>.............] - ETA: 54s - loss: 1.1335 - regression_loss: 0.9997 - classification_loss: 0.1338 288/500 [================>.............] - ETA: 53s - loss: 1.1347 - regression_loss: 1.0009 - classification_loss: 0.1338 289/500 [================>.............] - ETA: 53s - loss: 1.1348 - regression_loss: 1.0010 - classification_loss: 0.1338 290/500 [================>.............] - ETA: 53s - loss: 1.1334 - regression_loss: 0.9998 - classification_loss: 0.1336 291/500 [================>.............] - ETA: 53s - loss: 1.1338 - regression_loss: 1.0002 - classification_loss: 0.1336 292/500 [================>.............] - ETA: 52s - loss: 1.1337 - regression_loss: 1.0001 - classification_loss: 0.1336 293/500 [================>.............] - ETA: 52s - loss: 1.1318 - regression_loss: 0.9983 - classification_loss: 0.1335 294/500 [================>.............] - ETA: 52s - loss: 1.1303 - regression_loss: 0.9971 - classification_loss: 0.1332 295/500 [================>.............] - ETA: 51s - loss: 1.1308 - regression_loss: 0.9975 - classification_loss: 0.1333 296/500 [================>.............] - ETA: 51s - loss: 1.1298 - regression_loss: 0.9967 - classification_loss: 0.1331 297/500 [================>.............] 
- ETA: 51s - loss: 1.1319 - regression_loss: 0.9985 - classification_loss: 0.1334 298/500 [================>.............] - ETA: 51s - loss: 1.1325 - regression_loss: 0.9990 - classification_loss: 0.1336 299/500 [================>.............] - ETA: 50s - loss: 1.1302 - regression_loss: 0.9969 - classification_loss: 0.1333 300/500 [=================>............] - ETA: 50s - loss: 1.1308 - regression_loss: 0.9973 - classification_loss: 0.1334 301/500 [=================>............] - ETA: 50s - loss: 1.1321 - regression_loss: 0.9985 - classification_loss: 0.1336 302/500 [=================>............] - ETA: 50s - loss: 1.1331 - regression_loss: 0.9993 - classification_loss: 0.1338 303/500 [=================>............] - ETA: 49s - loss: 1.1342 - regression_loss: 1.0000 - classification_loss: 0.1342 304/500 [=================>............] - ETA: 49s - loss: 1.1349 - regression_loss: 1.0008 - classification_loss: 0.1342 305/500 [=================>............] - ETA: 49s - loss: 1.1356 - regression_loss: 1.0014 - classification_loss: 0.1342 306/500 [=================>............] - ETA: 49s - loss: 1.1344 - regression_loss: 1.0003 - classification_loss: 0.1341 307/500 [=================>............] - ETA: 48s - loss: 1.1345 - regression_loss: 1.0003 - classification_loss: 0.1342 308/500 [=================>............] - ETA: 48s - loss: 1.1350 - regression_loss: 1.0009 - classification_loss: 0.1341 309/500 [=================>............] - ETA: 48s - loss: 1.1327 - regression_loss: 0.9989 - classification_loss: 0.1338 310/500 [=================>............] - ETA: 48s - loss: 1.1307 - regression_loss: 0.9972 - classification_loss: 0.1335 311/500 [=================>............] - ETA: 47s - loss: 1.1290 - regression_loss: 0.9959 - classification_loss: 0.1331 312/500 [=================>............] - ETA: 47s - loss: 1.1314 - regression_loss: 0.9978 - classification_loss: 0.1336 313/500 [=================>............] 
- ETA: 47s - loss: 1.1298 - regression_loss: 0.9964 - classification_loss: 0.1334 314/500 [=================>............] - ETA: 47s - loss: 1.1316 - regression_loss: 0.9981 - classification_loss: 0.1335 315/500 [=================>............] - ETA: 46s - loss: 1.1326 - regression_loss: 0.9990 - classification_loss: 0.1336 316/500 [=================>............] - ETA: 46s - loss: 1.1344 - regression_loss: 1.0009 - classification_loss: 0.1335 317/500 [==================>...........] - ETA: 46s - loss: 1.1343 - regression_loss: 1.0008 - classification_loss: 0.1335 318/500 [==================>...........] - ETA: 46s - loss: 1.1348 - regression_loss: 1.0014 - classification_loss: 0.1335 319/500 [==================>...........] - ETA: 45s - loss: 1.1346 - regression_loss: 1.0014 - classification_loss: 0.1332 320/500 [==================>...........] - ETA: 45s - loss: 1.1341 - regression_loss: 1.0010 - classification_loss: 0.1331 321/500 [==================>...........] - ETA: 45s - loss: 1.1321 - regression_loss: 0.9992 - classification_loss: 0.1329 322/500 [==================>...........] - ETA: 45s - loss: 1.1310 - regression_loss: 0.9982 - classification_loss: 0.1327 323/500 [==================>...........] - ETA: 44s - loss: 1.1298 - regression_loss: 0.9973 - classification_loss: 0.1325 324/500 [==================>...........] - ETA: 44s - loss: 1.1298 - regression_loss: 0.9974 - classification_loss: 0.1324 325/500 [==================>...........] - ETA: 44s - loss: 1.1300 - regression_loss: 0.9974 - classification_loss: 0.1326 326/500 [==================>...........] - ETA: 44s - loss: 1.1307 - regression_loss: 0.9981 - classification_loss: 0.1326 327/500 [==================>...........] - ETA: 43s - loss: 1.1318 - regression_loss: 0.9990 - classification_loss: 0.1328 328/500 [==================>...........] - ETA: 43s - loss: 1.1298 - regression_loss: 0.9973 - classification_loss: 0.1325 329/500 [==================>...........] 
- ETA: 43s - loss: 1.1290 - regression_loss: 0.9967 - classification_loss: 0.1323 330/500 [==================>...........] - ETA: 43s - loss: 1.1291 - regression_loss: 0.9968 - classification_loss: 0.1323 331/500 [==================>...........] - ETA: 42s - loss: 1.1292 - regression_loss: 0.9967 - classification_loss: 0.1324 332/500 [==================>...........] - ETA: 42s - loss: 1.1294 - regression_loss: 0.9971 - classification_loss: 0.1323 333/500 [==================>...........] - ETA: 42s - loss: 1.1280 - regression_loss: 0.9959 - classification_loss: 0.1321 334/500 [===================>..........] - ETA: 42s - loss: 1.1288 - regression_loss: 0.9967 - classification_loss: 0.1321 335/500 [===================>..........] - ETA: 41s - loss: 1.1296 - regression_loss: 0.9975 - classification_loss: 0.1321 336/500 [===================>..........] - ETA: 41s - loss: 1.1297 - regression_loss: 0.9976 - classification_loss: 0.1320 337/500 [===================>..........] - ETA: 41s - loss: 1.1322 - regression_loss: 0.9997 - classification_loss: 0.1325 338/500 [===================>..........] - ETA: 41s - loss: 1.1328 - regression_loss: 1.0002 - classification_loss: 0.1326 339/500 [===================>..........] - ETA: 40s - loss: 1.1336 - regression_loss: 1.0008 - classification_loss: 0.1328 340/500 [===================>..........] - ETA: 40s - loss: 1.1328 - regression_loss: 1.0000 - classification_loss: 0.1329 341/500 [===================>..........] - ETA: 40s - loss: 1.1315 - regression_loss: 0.9987 - classification_loss: 0.1328 342/500 [===================>..........] - ETA: 40s - loss: 1.1306 - regression_loss: 0.9976 - classification_loss: 0.1330 343/500 [===================>..........] - ETA: 39s - loss: 1.1314 - regression_loss: 0.9983 - classification_loss: 0.1331 344/500 [===================>..........] - ETA: 39s - loss: 1.1326 - regression_loss: 0.9993 - classification_loss: 0.1333 345/500 [===================>..........] 
- ETA: 39s - loss: 1.1311 - regression_loss: 0.9981 - classification_loss: 0.1330 346/500 [===================>..........] - ETA: 39s - loss: 1.1315 - regression_loss: 0.9985 - classification_loss: 0.1329 347/500 [===================>..........] - ETA: 38s - loss: 1.1310 - regression_loss: 0.9981 - classification_loss: 0.1329 348/500 [===================>..........] - ETA: 38s - loss: 1.1314 - regression_loss: 0.9986 - classification_loss: 0.1329 349/500 [===================>..........] - ETA: 38s - loss: 1.1297 - regression_loss: 0.9971 - classification_loss: 0.1326 350/500 [====================>.........] - ETA: 38s - loss: 1.1310 - regression_loss: 0.9983 - classification_loss: 0.1327 351/500 [====================>.........] - ETA: 37s - loss: 1.1320 - regression_loss: 0.9990 - classification_loss: 0.1329 352/500 [====================>.........] - ETA: 37s - loss: 1.1332 - regression_loss: 0.9999 - classification_loss: 0.1333 353/500 [====================>.........] - ETA: 37s - loss: 1.1330 - regression_loss: 0.9999 - classification_loss: 0.1330 354/500 [====================>.........] - ETA: 37s - loss: 1.1337 - regression_loss: 1.0005 - classification_loss: 0.1332 355/500 [====================>.........] - ETA: 36s - loss: 1.1330 - regression_loss: 0.9999 - classification_loss: 0.1332 356/500 [====================>.........] - ETA: 36s - loss: 1.1326 - regression_loss: 0.9994 - classification_loss: 0.1331 357/500 [====================>.........] - ETA: 36s - loss: 1.1336 - regression_loss: 1.0003 - classification_loss: 0.1333 358/500 [====================>.........] - ETA: 36s - loss: 1.1340 - regression_loss: 1.0006 - classification_loss: 0.1334 359/500 [====================>.........] - ETA: 35s - loss: 1.1339 - regression_loss: 1.0004 - classification_loss: 0.1335 360/500 [====================>.........] - ETA: 35s - loss: 1.1341 - regression_loss: 1.0006 - classification_loss: 0.1335 361/500 [====================>.........] 
[... per-batch progress output for epoch 30 (steps 362-499) trimmed ...]
500/500 [==============================] - 127s 253ms/step - loss: 1.1134 - regression_loss: 0.9831 - classification_loss: 0.1303
1172 instances of class plum with average precision: 0.7771
mAP: 0.7771
Epoch 00030: saving model to ./training/snapshots/resnet50_pascal_30.h5
Epoch 31/150
[... per-batch progress output for epoch 31 (steps 1-196) trimmed ...]
- ETA: 1:16 - loss: 1.1186 - regression_loss: 0.9813 - classification_loss: 0.1373 197/500 [==========>...................] - ETA: 1:16 - loss: 1.1161 - regression_loss: 0.9793 - classification_loss: 0.1369 198/500 [==========>...................] - ETA: 1:16 - loss: 1.1156 - regression_loss: 0.9789 - classification_loss: 0.1367 199/500 [==========>...................] - ETA: 1:15 - loss: 1.1142 - regression_loss: 0.9779 - classification_loss: 0.1363 200/500 [===========>..................] - ETA: 1:15 - loss: 1.1138 - regression_loss: 0.9775 - classification_loss: 0.1363 201/500 [===========>..................] - ETA: 1:15 - loss: 1.1140 - regression_loss: 0.9777 - classification_loss: 0.1363 202/500 [===========>..................] - ETA: 1:15 - loss: 1.1123 - regression_loss: 0.9764 - classification_loss: 0.1359 203/500 [===========>..................] - ETA: 1:14 - loss: 1.1119 - regression_loss: 0.9762 - classification_loss: 0.1358 204/500 [===========>..................] - ETA: 1:14 - loss: 1.1116 - regression_loss: 0.9762 - classification_loss: 0.1355 205/500 [===========>..................] - ETA: 1:14 - loss: 1.1123 - regression_loss: 0.9768 - classification_loss: 0.1355 206/500 [===========>..................] - ETA: 1:14 - loss: 1.1115 - regression_loss: 0.9759 - classification_loss: 0.1355 207/500 [===========>..................] - ETA: 1:13 - loss: 1.1136 - regression_loss: 0.9776 - classification_loss: 0.1360 208/500 [===========>..................] - ETA: 1:13 - loss: 1.1118 - regression_loss: 0.9762 - classification_loss: 0.1356 209/500 [===========>..................] - ETA: 1:13 - loss: 1.1133 - regression_loss: 0.9776 - classification_loss: 0.1356 210/500 [===========>..................] - ETA: 1:13 - loss: 1.1119 - regression_loss: 0.9766 - classification_loss: 0.1352 211/500 [===========>..................] - ETA: 1:12 - loss: 1.1130 - regression_loss: 0.9780 - classification_loss: 0.1350 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.1104 - regression_loss: 0.9758 - classification_loss: 0.1346 213/500 [===========>..................] - ETA: 1:12 - loss: 1.1100 - regression_loss: 0.9756 - classification_loss: 0.1344 214/500 [===========>..................] - ETA: 1:12 - loss: 1.1090 - regression_loss: 0.9748 - classification_loss: 0.1342 215/500 [===========>..................] - ETA: 1:11 - loss: 1.1068 - regression_loss: 0.9731 - classification_loss: 0.1337 216/500 [===========>..................] - ETA: 1:11 - loss: 1.1072 - regression_loss: 0.9735 - classification_loss: 0.1337 217/500 [============>.................] - ETA: 1:11 - loss: 1.1047 - regression_loss: 0.9713 - classification_loss: 0.1334 218/500 [============>.................] - ETA: 1:11 - loss: 1.1054 - regression_loss: 0.9721 - classification_loss: 0.1334 219/500 [============>.................] - ETA: 1:10 - loss: 1.1045 - regression_loss: 0.9716 - classification_loss: 0.1329 220/500 [============>.................] - ETA: 1:10 - loss: 1.1069 - regression_loss: 0.9737 - classification_loss: 0.1332 221/500 [============>.................] - ETA: 1:10 - loss: 1.1055 - regression_loss: 0.9727 - classification_loss: 0.1328 222/500 [============>.................] - ETA: 1:10 - loss: 1.1070 - regression_loss: 0.9739 - classification_loss: 0.1331 223/500 [============>.................] - ETA: 1:09 - loss: 1.1047 - regression_loss: 0.9720 - classification_loss: 0.1327 224/500 [============>.................] - ETA: 1:09 - loss: 1.1071 - regression_loss: 0.9741 - classification_loss: 0.1330 225/500 [============>.................] - ETA: 1:09 - loss: 1.1083 - regression_loss: 0.9750 - classification_loss: 0.1334 226/500 [============>.................] - ETA: 1:09 - loss: 1.1094 - regression_loss: 0.9760 - classification_loss: 0.1334 227/500 [============>.................] - ETA: 1:08 - loss: 1.1105 - regression_loss: 0.9768 - classification_loss: 0.1336 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.1107 - regression_loss: 0.9771 - classification_loss: 0.1336 229/500 [============>.................] - ETA: 1:08 - loss: 1.1115 - regression_loss: 0.9778 - classification_loss: 0.1337 230/500 [============>.................] - ETA: 1:08 - loss: 1.1119 - regression_loss: 0.9782 - classification_loss: 0.1337 231/500 [============>.................] - ETA: 1:07 - loss: 1.1148 - regression_loss: 0.9805 - classification_loss: 0.1344 232/500 [============>.................] - ETA: 1:07 - loss: 1.1168 - regression_loss: 0.9822 - classification_loss: 0.1346 233/500 [============>.................] - ETA: 1:07 - loss: 1.1168 - regression_loss: 0.9822 - classification_loss: 0.1346 234/500 [=============>................] - ETA: 1:07 - loss: 1.1147 - regression_loss: 0.9804 - classification_loss: 0.1343 235/500 [=============>................] - ETA: 1:06 - loss: 1.1142 - regression_loss: 0.9800 - classification_loss: 0.1342 236/500 [=============>................] - ETA: 1:06 - loss: 1.1142 - regression_loss: 0.9800 - classification_loss: 0.1342 237/500 [=============>................] - ETA: 1:06 - loss: 1.1135 - regression_loss: 0.9793 - classification_loss: 0.1342 238/500 [=============>................] - ETA: 1:06 - loss: 1.1127 - regression_loss: 0.9786 - classification_loss: 0.1340 239/500 [=============>................] - ETA: 1:05 - loss: 1.1117 - regression_loss: 0.9778 - classification_loss: 0.1339 240/500 [=============>................] - ETA: 1:05 - loss: 1.1113 - regression_loss: 0.9774 - classification_loss: 0.1339 241/500 [=============>................] - ETA: 1:05 - loss: 1.1118 - regression_loss: 0.9779 - classification_loss: 0.1339 242/500 [=============>................] - ETA: 1:04 - loss: 1.1145 - regression_loss: 0.9803 - classification_loss: 0.1342 243/500 [=============>................] - ETA: 1:04 - loss: 1.1118 - regression_loss: 0.9781 - classification_loss: 0.1338 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.1133 - regression_loss: 0.9793 - classification_loss: 0.1340 245/500 [=============>................] - ETA: 1:04 - loss: 1.1125 - regression_loss: 0.9784 - classification_loss: 0.1340 246/500 [=============>................] - ETA: 1:03 - loss: 1.1110 - regression_loss: 0.9773 - classification_loss: 0.1338 247/500 [=============>................] - ETA: 1:03 - loss: 1.1082 - regression_loss: 0.9748 - classification_loss: 0.1334 248/500 [=============>................] - ETA: 1:03 - loss: 1.1082 - regression_loss: 0.9749 - classification_loss: 0.1334 249/500 [=============>................] - ETA: 1:03 - loss: 1.1076 - regression_loss: 0.9743 - classification_loss: 0.1333 250/500 [==============>...............] - ETA: 1:02 - loss: 1.1108 - regression_loss: 0.9772 - classification_loss: 0.1336 251/500 [==============>...............] - ETA: 1:02 - loss: 1.1093 - regression_loss: 0.9761 - classification_loss: 0.1332 252/500 [==============>...............] - ETA: 1:02 - loss: 1.1088 - regression_loss: 0.9758 - classification_loss: 0.1330 253/500 [==============>...............] - ETA: 1:02 - loss: 1.1099 - regression_loss: 0.9767 - classification_loss: 0.1332 254/500 [==============>...............] - ETA: 1:01 - loss: 1.1095 - regression_loss: 0.9764 - classification_loss: 0.1331 255/500 [==============>...............] - ETA: 1:01 - loss: 1.1099 - regression_loss: 0.9768 - classification_loss: 0.1331 256/500 [==============>...............] - ETA: 1:01 - loss: 1.1100 - regression_loss: 0.9770 - classification_loss: 0.1330 257/500 [==============>...............] - ETA: 1:01 - loss: 1.1105 - regression_loss: 0.9775 - classification_loss: 0.1330 258/500 [==============>...............] - ETA: 1:00 - loss: 1.1084 - regression_loss: 0.9758 - classification_loss: 0.1326 259/500 [==============>...............] - ETA: 1:00 - loss: 1.1086 - regression_loss: 0.9759 - classification_loss: 0.1327 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.1068 - regression_loss: 0.9744 - classification_loss: 0.1324 261/500 [==============>...............] - ETA: 1:00 - loss: 1.1073 - regression_loss: 0.9749 - classification_loss: 0.1323 262/500 [==============>...............] - ETA: 59s - loss: 1.1091 - regression_loss: 0.9768 - classification_loss: 0.1323  263/500 [==============>...............] - ETA: 59s - loss: 1.1091 - regression_loss: 0.9769 - classification_loss: 0.1322 264/500 [==============>...............] - ETA: 59s - loss: 1.1071 - regression_loss: 0.9752 - classification_loss: 0.1318 265/500 [==============>...............] - ETA: 59s - loss: 1.1067 - regression_loss: 0.9749 - classification_loss: 0.1318 266/500 [==============>...............] - ETA: 58s - loss: 1.1038 - regression_loss: 0.9724 - classification_loss: 0.1314 267/500 [===============>..............] - ETA: 58s - loss: 1.1048 - regression_loss: 0.9736 - classification_loss: 0.1312 268/500 [===============>..............] - ETA: 58s - loss: 1.1056 - regression_loss: 0.9742 - classification_loss: 0.1314 269/500 [===============>..............] - ETA: 58s - loss: 1.1064 - regression_loss: 0.9749 - classification_loss: 0.1314 270/500 [===============>..............] - ETA: 57s - loss: 1.1061 - regression_loss: 0.9748 - classification_loss: 0.1313 271/500 [===============>..............] - ETA: 57s - loss: 1.1067 - regression_loss: 0.9754 - classification_loss: 0.1313 272/500 [===============>..............] - ETA: 57s - loss: 1.1070 - regression_loss: 0.9756 - classification_loss: 0.1313 273/500 [===============>..............] - ETA: 57s - loss: 1.1071 - regression_loss: 0.9758 - classification_loss: 0.1313 274/500 [===============>..............] - ETA: 56s - loss: 1.1079 - regression_loss: 0.9764 - classification_loss: 0.1316 275/500 [===============>..............] - ETA: 56s - loss: 1.1099 - regression_loss: 0.9780 - classification_loss: 0.1319 276/500 [===============>..............] 
- ETA: 56s - loss: 1.1099 - regression_loss: 0.9777 - classification_loss: 0.1321 277/500 [===============>..............] - ETA: 56s - loss: 1.1108 - regression_loss: 0.9785 - classification_loss: 0.1323 278/500 [===============>..............] - ETA: 55s - loss: 1.1107 - regression_loss: 0.9784 - classification_loss: 0.1323 279/500 [===============>..............] - ETA: 55s - loss: 1.1095 - regression_loss: 0.9773 - classification_loss: 0.1322 280/500 [===============>..............] - ETA: 55s - loss: 1.1108 - regression_loss: 0.9784 - classification_loss: 0.1325 281/500 [===============>..............] - ETA: 55s - loss: 1.1139 - regression_loss: 0.9809 - classification_loss: 0.1330 282/500 [===============>..............] - ETA: 54s - loss: 1.1158 - regression_loss: 0.9824 - classification_loss: 0.1334 283/500 [===============>..............] - ETA: 54s - loss: 1.1171 - regression_loss: 0.9837 - classification_loss: 0.1334 284/500 [================>.............] - ETA: 54s - loss: 1.1192 - regression_loss: 0.9854 - classification_loss: 0.1338 285/500 [================>.............] - ETA: 54s - loss: 1.1200 - regression_loss: 0.9862 - classification_loss: 0.1339 286/500 [================>.............] - ETA: 53s - loss: 1.1211 - regression_loss: 0.9870 - classification_loss: 0.1341 287/500 [================>.............] - ETA: 53s - loss: 1.1212 - regression_loss: 0.9872 - classification_loss: 0.1340 288/500 [================>.............] - ETA: 53s - loss: 1.1208 - regression_loss: 0.9869 - classification_loss: 0.1339 289/500 [================>.............] - ETA: 53s - loss: 1.1201 - regression_loss: 0.9865 - classification_loss: 0.1336 290/500 [================>.............] - ETA: 52s - loss: 1.1206 - regression_loss: 0.9868 - classification_loss: 0.1338 291/500 [================>.............] - ETA: 52s - loss: 1.1195 - regression_loss: 0.9860 - classification_loss: 0.1335 292/500 [================>.............] 
- ETA: 52s - loss: 1.1168 - regression_loss: 0.9836 - classification_loss: 0.1331 293/500 [================>.............] - ETA: 52s - loss: 1.1143 - regression_loss: 0.9815 - classification_loss: 0.1328 294/500 [================>.............] - ETA: 51s - loss: 1.1142 - regression_loss: 0.9815 - classification_loss: 0.1327 295/500 [================>.............] - ETA: 51s - loss: 1.1131 - regression_loss: 0.9805 - classification_loss: 0.1325 296/500 [================>.............] - ETA: 51s - loss: 1.1114 - regression_loss: 0.9790 - classification_loss: 0.1324 297/500 [================>.............] - ETA: 51s - loss: 1.1101 - regression_loss: 0.9779 - classification_loss: 0.1322 298/500 [================>.............] - ETA: 50s - loss: 1.1105 - regression_loss: 0.9782 - classification_loss: 0.1323 299/500 [================>.............] - ETA: 50s - loss: 1.1140 - regression_loss: 0.9812 - classification_loss: 0.1327 300/500 [=================>............] - ETA: 50s - loss: 1.1140 - regression_loss: 0.9813 - classification_loss: 0.1327 301/500 [=================>............] - ETA: 50s - loss: 1.1146 - regression_loss: 0.9819 - classification_loss: 0.1327 302/500 [=================>............] - ETA: 49s - loss: 1.1139 - regression_loss: 0.9814 - classification_loss: 0.1326 303/500 [=================>............] - ETA: 49s - loss: 1.1119 - regression_loss: 0.9796 - classification_loss: 0.1322 304/500 [=================>............] - ETA: 49s - loss: 1.1122 - regression_loss: 0.9800 - classification_loss: 0.1322 305/500 [=================>............] - ETA: 49s - loss: 1.1122 - regression_loss: 0.9799 - classification_loss: 0.1322 306/500 [=================>............] - ETA: 48s - loss: 1.1113 - regression_loss: 0.9793 - classification_loss: 0.1321 307/500 [=================>............] - ETA: 48s - loss: 1.1119 - regression_loss: 0.9798 - classification_loss: 0.1322 308/500 [=================>............] 
- ETA: 48s - loss: 1.1117 - regression_loss: 0.9796 - classification_loss: 0.1321 309/500 [=================>............] - ETA: 48s - loss: 1.1129 - regression_loss: 0.9807 - classification_loss: 0.1322 310/500 [=================>............] - ETA: 47s - loss: 1.1110 - regression_loss: 0.9790 - classification_loss: 0.1320 311/500 [=================>............] - ETA: 47s - loss: 1.1122 - regression_loss: 0.9800 - classification_loss: 0.1321 312/500 [=================>............] - ETA: 47s - loss: 1.1120 - regression_loss: 0.9799 - classification_loss: 0.1321 313/500 [=================>............] - ETA: 47s - loss: 1.1119 - regression_loss: 0.9799 - classification_loss: 0.1320 314/500 [=================>............] - ETA: 46s - loss: 1.1126 - regression_loss: 0.9806 - classification_loss: 0.1319 315/500 [=================>............] - ETA: 46s - loss: 1.1140 - regression_loss: 0.9818 - classification_loss: 0.1321 316/500 [=================>............] - ETA: 46s - loss: 1.1147 - regression_loss: 0.9825 - classification_loss: 0.1322 317/500 [==================>...........] - ETA: 46s - loss: 1.1162 - regression_loss: 0.9837 - classification_loss: 0.1325 318/500 [==================>...........] - ETA: 45s - loss: 1.1173 - regression_loss: 0.9847 - classification_loss: 0.1326 319/500 [==================>...........] - ETA: 45s - loss: 1.1170 - regression_loss: 0.9846 - classification_loss: 0.1325 320/500 [==================>...........] - ETA: 45s - loss: 1.1171 - regression_loss: 0.9846 - classification_loss: 0.1325 321/500 [==================>...........] - ETA: 45s - loss: 1.1159 - regression_loss: 0.9837 - classification_loss: 0.1322 322/500 [==================>...........] - ETA: 44s - loss: 1.1156 - regression_loss: 0.9834 - classification_loss: 0.1322 323/500 [==================>...........] - ETA: 44s - loss: 1.1146 - regression_loss: 0.9827 - classification_loss: 0.1319 324/500 [==================>...........] 
- ETA: 44s - loss: 1.1156 - regression_loss: 0.9836 - classification_loss: 0.1320 325/500 [==================>...........] - ETA: 44s - loss: 1.1165 - regression_loss: 0.9844 - classification_loss: 0.1321 326/500 [==================>...........] - ETA: 43s - loss: 1.1173 - regression_loss: 0.9849 - classification_loss: 0.1324 327/500 [==================>...........] - ETA: 43s - loss: 1.1166 - regression_loss: 0.9843 - classification_loss: 0.1323 328/500 [==================>...........] - ETA: 43s - loss: 1.1166 - regression_loss: 0.9842 - classification_loss: 0.1324 329/500 [==================>...........] - ETA: 43s - loss: 1.1162 - regression_loss: 0.9841 - classification_loss: 0.1322 330/500 [==================>...........] - ETA: 42s - loss: 1.1143 - regression_loss: 0.9823 - classification_loss: 0.1320 331/500 [==================>...........] - ETA: 42s - loss: 1.1126 - regression_loss: 0.9809 - classification_loss: 0.1318 332/500 [==================>...........] - ETA: 42s - loss: 1.1118 - regression_loss: 0.9801 - classification_loss: 0.1317 333/500 [==================>...........] - ETA: 42s - loss: 1.1099 - regression_loss: 0.9785 - classification_loss: 0.1314 334/500 [===================>..........] - ETA: 41s - loss: 1.1110 - regression_loss: 0.9796 - classification_loss: 0.1314 335/500 [===================>..........] - ETA: 41s - loss: 1.1112 - regression_loss: 0.9799 - classification_loss: 0.1313 336/500 [===================>..........] - ETA: 41s - loss: 1.1126 - regression_loss: 0.9812 - classification_loss: 0.1315 337/500 [===================>..........] - ETA: 41s - loss: 1.1126 - regression_loss: 0.9810 - classification_loss: 0.1316 338/500 [===================>..........] - ETA: 40s - loss: 1.1135 - regression_loss: 0.9816 - classification_loss: 0.1319 339/500 [===================>..........] - ETA: 40s - loss: 1.1142 - regression_loss: 0.9822 - classification_loss: 0.1320 340/500 [===================>..........] 
- ETA: 40s - loss: 1.1139 - regression_loss: 0.9820 - classification_loss: 0.1319 341/500 [===================>..........] - ETA: 40s - loss: 1.1135 - regression_loss: 0.9816 - classification_loss: 0.1319 342/500 [===================>..........] - ETA: 39s - loss: 1.1135 - regression_loss: 0.9818 - classification_loss: 0.1317 343/500 [===================>..........] - ETA: 39s - loss: 1.1114 - regression_loss: 0.9800 - classification_loss: 0.1314 344/500 [===================>..........] - ETA: 39s - loss: 1.1118 - regression_loss: 0.9804 - classification_loss: 0.1315 345/500 [===================>..........] - ETA: 39s - loss: 1.1113 - regression_loss: 0.9800 - classification_loss: 0.1313 346/500 [===================>..........] - ETA: 38s - loss: 1.1097 - regression_loss: 0.9786 - classification_loss: 0.1311 347/500 [===================>..........] - ETA: 38s - loss: 1.1120 - regression_loss: 0.9804 - classification_loss: 0.1316 348/500 [===================>..........] - ETA: 38s - loss: 1.1113 - regression_loss: 0.9799 - classification_loss: 0.1314 349/500 [===================>..........] - ETA: 38s - loss: 1.1102 - regression_loss: 0.9790 - classification_loss: 0.1312 350/500 [====================>.........] - ETA: 37s - loss: 1.1113 - regression_loss: 0.9799 - classification_loss: 0.1314 351/500 [====================>.........] - ETA: 37s - loss: 1.1125 - regression_loss: 0.9809 - classification_loss: 0.1316 352/500 [====================>.........] - ETA: 37s - loss: 1.1123 - regression_loss: 0.9807 - classification_loss: 0.1317 353/500 [====================>.........] - ETA: 37s - loss: 1.1111 - regression_loss: 0.9797 - classification_loss: 0.1314 354/500 [====================>.........] - ETA: 36s - loss: 1.1108 - regression_loss: 0.9794 - classification_loss: 0.1314 355/500 [====================>.........] - ETA: 36s - loss: 1.1116 - regression_loss: 0.9801 - classification_loss: 0.1315 356/500 [====================>.........] 
- ETA: 36s - loss: 1.1096 - regression_loss: 0.9784 - classification_loss: 0.1312 357/500 [====================>.........] - ETA: 36s - loss: 1.1098 - regression_loss: 0.9786 - classification_loss: 0.1312 358/500 [====================>.........] - ETA: 35s - loss: 1.1091 - regression_loss: 0.9780 - classification_loss: 0.1311 359/500 [====================>.........] - ETA: 35s - loss: 1.1092 - regression_loss: 0.9781 - classification_loss: 0.1311 360/500 [====================>.........] - ETA: 35s - loss: 1.1093 - regression_loss: 0.9782 - classification_loss: 0.1310 361/500 [====================>.........] - ETA: 35s - loss: 1.1093 - regression_loss: 0.9781 - classification_loss: 0.1312 362/500 [====================>.........] - ETA: 34s - loss: 1.1079 - regression_loss: 0.9769 - classification_loss: 0.1309 363/500 [====================>.........] - ETA: 34s - loss: 1.1084 - regression_loss: 0.9774 - classification_loss: 0.1310 364/500 [====================>.........] - ETA: 34s - loss: 1.1072 - regression_loss: 0.9763 - classification_loss: 0.1308 365/500 [====================>.........] - ETA: 34s - loss: 1.1078 - regression_loss: 0.9769 - classification_loss: 0.1309 366/500 [====================>.........] - ETA: 33s - loss: 1.1068 - regression_loss: 0.9761 - classification_loss: 0.1307 367/500 [=====================>........] - ETA: 33s - loss: 1.1080 - regression_loss: 0.9771 - classification_loss: 0.1309 368/500 [=====================>........] - ETA: 33s - loss: 1.1085 - regression_loss: 0.9778 - classification_loss: 0.1308 369/500 [=====================>........] - ETA: 33s - loss: 1.1095 - regression_loss: 0.9785 - classification_loss: 0.1310 370/500 [=====================>........] - ETA: 32s - loss: 1.1097 - regression_loss: 0.9788 - classification_loss: 0.1309 371/500 [=====================>........] - ETA: 32s - loss: 1.1104 - regression_loss: 0.9794 - classification_loss: 0.1310 372/500 [=====================>........] 
- ETA: 32s - loss: 1.1099 - regression_loss: 0.9790 - classification_loss: 0.1309 373/500 [=====================>........] - ETA: 32s - loss: 1.1103 - regression_loss: 0.9792 - classification_loss: 0.1311 374/500 [=====================>........] - ETA: 31s - loss: 1.1105 - regression_loss: 0.9794 - classification_loss: 0.1311 375/500 [=====================>........] - ETA: 31s - loss: 1.1112 - regression_loss: 0.9800 - classification_loss: 0.1312 376/500 [=====================>........] - ETA: 31s - loss: 1.1095 - regression_loss: 0.9785 - classification_loss: 0.1311 377/500 [=====================>........] - ETA: 31s - loss: 1.1094 - regression_loss: 0.9784 - classification_loss: 0.1310 378/500 [=====================>........] - ETA: 30s - loss: 1.1102 - regression_loss: 0.9792 - classification_loss: 0.1310 379/500 [=====================>........] - ETA: 30s - loss: 1.1086 - regression_loss: 0.9778 - classification_loss: 0.1308 380/500 [=====================>........] - ETA: 30s - loss: 1.1099 - regression_loss: 0.9789 - classification_loss: 0.1310 381/500 [=====================>........] - ETA: 30s - loss: 1.1112 - regression_loss: 0.9799 - classification_loss: 0.1313 382/500 [=====================>........] - ETA: 29s - loss: 1.1113 - regression_loss: 0.9801 - classification_loss: 0.1313 383/500 [=====================>........] - ETA: 29s - loss: 1.1125 - regression_loss: 0.9810 - classification_loss: 0.1314 384/500 [======================>.......] - ETA: 29s - loss: 1.1112 - regression_loss: 0.9800 - classification_loss: 0.1312 385/500 [======================>.......] - ETA: 29s - loss: 1.1114 - regression_loss: 0.9802 - classification_loss: 0.1312 386/500 [======================>.......] - ETA: 28s - loss: 1.1111 - regression_loss: 0.9801 - classification_loss: 0.1310 387/500 [======================>.......] - ETA: 28s - loss: 1.1125 - regression_loss: 0.9813 - classification_loss: 0.1311 388/500 [======================>.......] 
- ETA: 28s - loss: 1.1108 - regression_loss: 0.9798 - classification_loss: 0.1310 389/500 [======================>.......] - ETA: 28s - loss: 1.1098 - regression_loss: 0.9790 - classification_loss: 0.1308 390/500 [======================>.......] - ETA: 27s - loss: 1.1083 - regression_loss: 0.9777 - classification_loss: 0.1306 391/500 [======================>.......] - ETA: 27s - loss: 1.1097 - regression_loss: 0.9787 - classification_loss: 0.1310 392/500 [======================>.......] - ETA: 27s - loss: 1.1096 - regression_loss: 0.9787 - classification_loss: 0.1309 393/500 [======================>.......] - ETA: 27s - loss: 1.1101 - regression_loss: 0.9791 - classification_loss: 0.1310 394/500 [======================>.......] - ETA: 26s - loss: 1.1094 - regression_loss: 0.9784 - classification_loss: 0.1309 395/500 [======================>.......] - ETA: 26s - loss: 1.1115 - regression_loss: 0.9802 - classification_loss: 0.1312 396/500 [======================>.......] - ETA: 26s - loss: 1.1121 - regression_loss: 0.9808 - classification_loss: 0.1313 397/500 [======================>.......] - ETA: 25s - loss: 1.1112 - regression_loss: 0.9801 - classification_loss: 0.1312 398/500 [======================>.......] - ETA: 25s - loss: 1.1125 - regression_loss: 0.9811 - classification_loss: 0.1314 399/500 [======================>.......] - ETA: 25s - loss: 1.1144 - regression_loss: 0.9827 - classification_loss: 0.1317 400/500 [=======================>......] - ETA: 25s - loss: 1.1137 - regression_loss: 0.9822 - classification_loss: 0.1315 401/500 [=======================>......] - ETA: 24s - loss: 1.1137 - regression_loss: 0.9822 - classification_loss: 0.1315 402/500 [=======================>......] - ETA: 24s - loss: 1.1140 - regression_loss: 0.9824 - classification_loss: 0.1316 403/500 [=======================>......] - ETA: 24s - loss: 1.1144 - regression_loss: 0.9826 - classification_loss: 0.1318 404/500 [=======================>......] 
[per-batch progress-bar updates for steps 405-499 of epoch 31 omitted; loss held near 1.11-1.12]
500/500 [==============================] - 126s 252ms/step - loss: 1.1058 - regression_loss: 0.9749 - classification_loss: 0.1309
1172 instances of class plum with average precision: 0.7350
mAP: 0.7350
Epoch 00031: saving model to ./training/snapshots/resnet50_pascal_31.h5
Epoch 32/150
[per-batch progress-bar updates for steps 1-13 of epoch 32 omitted]
[per-batch progress-bar updates for steps 14-238 of epoch 32 omitted; loss fluctuated in the 1.09-1.13 range (regression_loss ~0.97-1.01, classification_loss ~0.126-0.136)]
- ETA: 1:06 - loss: 1.1014 - regression_loss: 0.9744 - classification_loss: 0.1270 239/500 [=============>................] - ETA: 1:05 - loss: 1.0991 - regression_loss: 0.9725 - classification_loss: 0.1266 240/500 [=============>................] - ETA: 1:05 - loss: 1.1008 - regression_loss: 0.9739 - classification_loss: 0.1268 241/500 [=============>................] - ETA: 1:05 - loss: 1.1023 - regression_loss: 0.9751 - classification_loss: 0.1272 242/500 [=============>................] - ETA: 1:05 - loss: 1.1024 - regression_loss: 0.9753 - classification_loss: 0.1271 243/500 [=============>................] - ETA: 1:04 - loss: 1.1002 - regression_loss: 0.9735 - classification_loss: 0.1268 244/500 [=============>................] - ETA: 1:04 - loss: 1.1009 - regression_loss: 0.9741 - classification_loss: 0.1268 245/500 [=============>................] - ETA: 1:04 - loss: 1.1021 - regression_loss: 0.9752 - classification_loss: 0.1270 246/500 [=============>................] - ETA: 1:04 - loss: 1.1015 - regression_loss: 0.9746 - classification_loss: 0.1269 247/500 [=============>................] - ETA: 1:03 - loss: 1.1018 - regression_loss: 0.9748 - classification_loss: 0.1270 248/500 [=============>................] - ETA: 1:03 - loss: 1.1017 - regression_loss: 0.9747 - classification_loss: 0.1270 249/500 [=============>................] - ETA: 1:03 - loss: 1.1018 - regression_loss: 0.9748 - classification_loss: 0.1270 250/500 [==============>...............] - ETA: 1:03 - loss: 1.1012 - regression_loss: 0.9745 - classification_loss: 0.1267 251/500 [==============>...............] - ETA: 1:02 - loss: 1.1015 - regression_loss: 0.9745 - classification_loss: 0.1270 252/500 [==============>...............] - ETA: 1:02 - loss: 1.1014 - regression_loss: 0.9746 - classification_loss: 0.1268 253/500 [==============>...............] - ETA: 1:02 - loss: 1.1015 - regression_loss: 0.9746 - classification_loss: 0.1269 254/500 [==============>...............] 
- ETA: 1:02 - loss: 1.1012 - regression_loss: 0.9745 - classification_loss: 0.1266 255/500 [==============>...............] - ETA: 1:01 - loss: 1.1008 - regression_loss: 0.9741 - classification_loss: 0.1267 256/500 [==============>...............] - ETA: 1:01 - loss: 1.1041 - regression_loss: 0.9767 - classification_loss: 0.1274 257/500 [==============>...............] - ETA: 1:01 - loss: 1.1057 - regression_loss: 0.9783 - classification_loss: 0.1274 258/500 [==============>...............] - ETA: 1:01 - loss: 1.1057 - regression_loss: 0.9786 - classification_loss: 0.1271 259/500 [==============>...............] - ETA: 1:00 - loss: 1.1029 - regression_loss: 0.9761 - classification_loss: 0.1268 260/500 [==============>...............] - ETA: 1:00 - loss: 1.1044 - regression_loss: 0.9773 - classification_loss: 0.1271 261/500 [==============>...............] - ETA: 1:00 - loss: 1.1022 - regression_loss: 0.9755 - classification_loss: 0.1268 262/500 [==============>...............] - ETA: 1:00 - loss: 1.1015 - regression_loss: 0.9750 - classification_loss: 0.1265 263/500 [==============>...............] - ETA: 59s - loss: 1.1017 - regression_loss: 0.9753 - classification_loss: 0.1264  264/500 [==============>...............] - ETA: 59s - loss: 1.1045 - regression_loss: 0.9774 - classification_loss: 0.1271 265/500 [==============>...............] - ETA: 59s - loss: 1.1022 - regression_loss: 0.9754 - classification_loss: 0.1268 266/500 [==============>...............] - ETA: 59s - loss: 1.1025 - regression_loss: 0.9759 - classification_loss: 0.1266 267/500 [===============>..............] - ETA: 58s - loss: 1.0993 - regression_loss: 0.9729 - classification_loss: 0.1264 268/500 [===============>..............] - ETA: 58s - loss: 1.0981 - regression_loss: 0.9717 - classification_loss: 0.1264 269/500 [===============>..............] - ETA: 58s - loss: 1.0994 - regression_loss: 0.9728 - classification_loss: 0.1266 270/500 [===============>..............] 
- ETA: 58s - loss: 1.0968 - regression_loss: 0.9706 - classification_loss: 0.1262 271/500 [===============>..............] - ETA: 57s - loss: 1.0953 - regression_loss: 0.9693 - classification_loss: 0.1260 272/500 [===============>..............] - ETA: 57s - loss: 1.0953 - regression_loss: 0.9692 - classification_loss: 0.1261 273/500 [===============>..............] - ETA: 57s - loss: 1.0953 - regression_loss: 0.9693 - classification_loss: 0.1260 274/500 [===============>..............] - ETA: 57s - loss: 1.0962 - regression_loss: 0.9701 - classification_loss: 0.1261 275/500 [===============>..............] - ETA: 56s - loss: 1.0955 - regression_loss: 0.9694 - classification_loss: 0.1261 276/500 [===============>..............] - ETA: 56s - loss: 1.0963 - regression_loss: 0.9700 - classification_loss: 0.1263 277/500 [===============>..............] - ETA: 56s - loss: 1.0961 - regression_loss: 0.9698 - classification_loss: 0.1263 278/500 [===============>..............] - ETA: 56s - loss: 1.0959 - regression_loss: 0.9698 - classification_loss: 0.1261 279/500 [===============>..............] - ETA: 55s - loss: 1.0959 - regression_loss: 0.9698 - classification_loss: 0.1261 280/500 [===============>..............] - ETA: 55s - loss: 1.0945 - regression_loss: 0.9687 - classification_loss: 0.1258 281/500 [===============>..............] - ETA: 55s - loss: 1.0950 - regression_loss: 0.9691 - classification_loss: 0.1259 282/500 [===============>..............] - ETA: 55s - loss: 1.0945 - regression_loss: 0.9688 - classification_loss: 0.1257 283/500 [===============>..............] - ETA: 54s - loss: 1.0964 - regression_loss: 0.9704 - classification_loss: 0.1260 284/500 [================>.............] - ETA: 54s - loss: 1.0959 - regression_loss: 0.9699 - classification_loss: 0.1259 285/500 [================>.............] - ETA: 54s - loss: 1.0961 - regression_loss: 0.9702 - classification_loss: 0.1259 286/500 [================>.............] 
- ETA: 54s - loss: 1.0950 - regression_loss: 0.9691 - classification_loss: 0.1258 287/500 [================>.............] - ETA: 53s - loss: 1.0938 - regression_loss: 0.9683 - classification_loss: 0.1256 288/500 [================>.............] - ETA: 53s - loss: 1.0932 - regression_loss: 0.9679 - classification_loss: 0.1252 289/500 [================>.............] - ETA: 53s - loss: 1.0928 - regression_loss: 0.9677 - classification_loss: 0.1251 290/500 [================>.............] - ETA: 53s - loss: 1.0929 - regression_loss: 0.9678 - classification_loss: 0.1251 291/500 [================>.............] - ETA: 52s - loss: 1.0924 - regression_loss: 0.9674 - classification_loss: 0.1250 292/500 [================>.............] - ETA: 52s - loss: 1.0911 - regression_loss: 0.9663 - classification_loss: 0.1248 293/500 [================>.............] - ETA: 52s - loss: 1.0901 - regression_loss: 0.9654 - classification_loss: 0.1247 294/500 [================>.............] - ETA: 52s - loss: 1.0905 - regression_loss: 0.9658 - classification_loss: 0.1247 295/500 [================>.............] - ETA: 51s - loss: 1.0894 - regression_loss: 0.9650 - classification_loss: 0.1245 296/500 [================>.............] - ETA: 51s - loss: 1.0909 - regression_loss: 0.9662 - classification_loss: 0.1248 297/500 [================>.............] - ETA: 51s - loss: 1.0906 - regression_loss: 0.9658 - classification_loss: 0.1248 298/500 [================>.............] - ETA: 51s - loss: 1.0914 - regression_loss: 0.9666 - classification_loss: 0.1249 299/500 [================>.............] - ETA: 50s - loss: 1.0933 - regression_loss: 0.9680 - classification_loss: 0.1253 300/500 [=================>............] - ETA: 50s - loss: 1.0918 - regression_loss: 0.9667 - classification_loss: 0.1251 301/500 [=================>............] - ETA: 50s - loss: 1.0910 - regression_loss: 0.9659 - classification_loss: 0.1251 302/500 [=================>............] 
- ETA: 50s - loss: 1.0922 - regression_loss: 0.9671 - classification_loss: 0.1252 303/500 [=================>............] - ETA: 49s - loss: 1.0928 - regression_loss: 0.9676 - classification_loss: 0.1252 304/500 [=================>............] - ETA: 49s - loss: 1.0942 - regression_loss: 0.9689 - classification_loss: 0.1253 305/500 [=================>............] - ETA: 49s - loss: 1.0954 - regression_loss: 0.9700 - classification_loss: 0.1254 306/500 [=================>............] - ETA: 49s - loss: 1.0956 - regression_loss: 0.9701 - classification_loss: 0.1255 307/500 [=================>............] - ETA: 48s - loss: 1.0940 - regression_loss: 0.9687 - classification_loss: 0.1253 308/500 [=================>............] - ETA: 48s - loss: 1.0930 - regression_loss: 0.9679 - classification_loss: 0.1251 309/500 [=================>............] - ETA: 48s - loss: 1.0920 - regression_loss: 0.9670 - classification_loss: 0.1249 310/500 [=================>............] - ETA: 48s - loss: 1.0933 - regression_loss: 0.9681 - classification_loss: 0.1252 311/500 [=================>............] - ETA: 47s - loss: 1.0933 - regression_loss: 0.9681 - classification_loss: 0.1252 312/500 [=================>............] - ETA: 47s - loss: 1.0941 - regression_loss: 0.9688 - classification_loss: 0.1253 313/500 [=================>............] - ETA: 47s - loss: 1.0924 - regression_loss: 0.9671 - classification_loss: 0.1252 314/500 [=================>............] - ETA: 47s - loss: 1.0917 - regression_loss: 0.9664 - classification_loss: 0.1253 315/500 [=================>............] - ETA: 46s - loss: 1.0918 - regression_loss: 0.9666 - classification_loss: 0.1252 316/500 [=================>............] - ETA: 46s - loss: 1.0929 - regression_loss: 0.9676 - classification_loss: 0.1254 317/500 [==================>...........] - ETA: 46s - loss: 1.0952 - regression_loss: 0.9696 - classification_loss: 0.1256 318/500 [==================>...........] 
- ETA: 46s - loss: 1.0961 - regression_loss: 0.9705 - classification_loss: 0.1256 319/500 [==================>...........] - ETA: 45s - loss: 1.0942 - regression_loss: 0.9690 - classification_loss: 0.1253 320/500 [==================>...........] - ETA: 45s - loss: 1.0945 - regression_loss: 0.9692 - classification_loss: 0.1253 321/500 [==================>...........] - ETA: 45s - loss: 1.0946 - regression_loss: 0.9694 - classification_loss: 0.1252 322/500 [==================>...........] - ETA: 45s - loss: 1.0951 - regression_loss: 0.9698 - classification_loss: 0.1253 323/500 [==================>...........] - ETA: 44s - loss: 1.0956 - regression_loss: 0.9702 - classification_loss: 0.1254 324/500 [==================>...........] - ETA: 44s - loss: 1.0962 - regression_loss: 0.9706 - classification_loss: 0.1256 325/500 [==================>...........] - ETA: 44s - loss: 1.0955 - regression_loss: 0.9701 - classification_loss: 0.1253 326/500 [==================>...........] - ETA: 44s - loss: 1.0951 - regression_loss: 0.9698 - classification_loss: 0.1253 327/500 [==================>...........] - ETA: 43s - loss: 1.0954 - regression_loss: 0.9701 - classification_loss: 0.1253 328/500 [==================>...........] - ETA: 43s - loss: 1.0965 - regression_loss: 0.9710 - classification_loss: 0.1255 329/500 [==================>...........] - ETA: 43s - loss: 1.0970 - regression_loss: 0.9714 - classification_loss: 0.1256 330/500 [==================>...........] - ETA: 43s - loss: 1.0962 - regression_loss: 0.9707 - classification_loss: 0.1255 331/500 [==================>...........] - ETA: 42s - loss: 1.0953 - regression_loss: 0.9700 - classification_loss: 0.1253 332/500 [==================>...........] - ETA: 42s - loss: 1.0953 - regression_loss: 0.9699 - classification_loss: 0.1254 333/500 [==================>...........] - ETA: 42s - loss: 1.0963 - regression_loss: 0.9708 - classification_loss: 0.1255 334/500 [===================>..........] 
- ETA: 42s - loss: 1.0961 - regression_loss: 0.9706 - classification_loss: 0.1255 335/500 [===================>..........] - ETA: 41s - loss: 1.0966 - regression_loss: 0.9712 - classification_loss: 0.1254 336/500 [===================>..........] - ETA: 41s - loss: 1.0969 - regression_loss: 0.9714 - classification_loss: 0.1256 337/500 [===================>..........] - ETA: 41s - loss: 1.0951 - regression_loss: 0.9698 - classification_loss: 0.1253 338/500 [===================>..........] - ETA: 41s - loss: 1.0949 - regression_loss: 0.9697 - classification_loss: 0.1252 339/500 [===================>..........] - ETA: 40s - loss: 1.0958 - regression_loss: 0.9704 - classification_loss: 0.1254 340/500 [===================>..........] - ETA: 40s - loss: 1.0940 - regression_loss: 0.9689 - classification_loss: 0.1251 341/500 [===================>..........] - ETA: 40s - loss: 1.0920 - regression_loss: 0.9672 - classification_loss: 0.1248 342/500 [===================>..........] - ETA: 40s - loss: 1.0933 - regression_loss: 0.9683 - classification_loss: 0.1250 343/500 [===================>..........] - ETA: 39s - loss: 1.0932 - regression_loss: 0.9682 - classification_loss: 0.1250 344/500 [===================>..........] - ETA: 39s - loss: 1.0930 - regression_loss: 0.9681 - classification_loss: 0.1249 345/500 [===================>..........] - ETA: 39s - loss: 1.0942 - regression_loss: 0.9691 - classification_loss: 0.1251 346/500 [===================>..........] - ETA: 38s - loss: 1.0956 - regression_loss: 0.9701 - classification_loss: 0.1255 347/500 [===================>..........] - ETA: 38s - loss: 1.0958 - regression_loss: 0.9704 - classification_loss: 0.1254 348/500 [===================>..........] - ETA: 38s - loss: 1.0951 - regression_loss: 0.9699 - classification_loss: 0.1252 349/500 [===================>..........] - ETA: 38s - loss: 1.0962 - regression_loss: 0.9709 - classification_loss: 0.1253 350/500 [====================>.........] 
- ETA: 37s - loss: 1.0967 - regression_loss: 0.9715 - classification_loss: 0.1251 351/500 [====================>.........] - ETA: 37s - loss: 1.0946 - regression_loss: 0.9697 - classification_loss: 0.1249 352/500 [====================>.........] - ETA: 37s - loss: 1.0957 - regression_loss: 0.9706 - classification_loss: 0.1251 353/500 [====================>.........] - ETA: 37s - loss: 1.0944 - regression_loss: 0.9695 - classification_loss: 0.1249 354/500 [====================>.........] - ETA: 36s - loss: 1.0955 - regression_loss: 0.9705 - classification_loss: 0.1250 355/500 [====================>.........] - ETA: 36s - loss: 1.0967 - regression_loss: 0.9715 - classification_loss: 0.1252 356/500 [====================>.........] - ETA: 36s - loss: 1.0963 - regression_loss: 0.9712 - classification_loss: 0.1251 357/500 [====================>.........] - ETA: 36s - loss: 1.0953 - regression_loss: 0.9703 - classification_loss: 0.1250 358/500 [====================>.........] - ETA: 35s - loss: 1.0954 - regression_loss: 0.9703 - classification_loss: 0.1251 359/500 [====================>.........] - ETA: 35s - loss: 1.0949 - regression_loss: 0.9700 - classification_loss: 0.1249 360/500 [====================>.........] - ETA: 35s - loss: 1.0947 - regression_loss: 0.9698 - classification_loss: 0.1249 361/500 [====================>.........] - ETA: 35s - loss: 1.0931 - regression_loss: 0.9684 - classification_loss: 0.1248 362/500 [====================>.........] - ETA: 34s - loss: 1.0927 - regression_loss: 0.9680 - classification_loss: 0.1247 363/500 [====================>.........] - ETA: 34s - loss: 1.0917 - regression_loss: 0.9672 - classification_loss: 0.1246 364/500 [====================>.........] - ETA: 34s - loss: 1.0911 - regression_loss: 0.9665 - classification_loss: 0.1245 365/500 [====================>.........] - ETA: 34s - loss: 1.0893 - regression_loss: 0.9650 - classification_loss: 0.1243 366/500 [====================>.........] 
- ETA: 33s - loss: 1.0894 - regression_loss: 0.9650 - classification_loss: 0.1243 367/500 [=====================>........] - ETA: 33s - loss: 1.0901 - regression_loss: 0.9655 - classification_loss: 0.1246 368/500 [=====================>........] - ETA: 33s - loss: 1.0900 - regression_loss: 0.9655 - classification_loss: 0.1246 369/500 [=====================>........] - ETA: 33s - loss: 1.0904 - regression_loss: 0.9658 - classification_loss: 0.1246 370/500 [=====================>........] - ETA: 32s - loss: 1.0923 - regression_loss: 0.9674 - classification_loss: 0.1249 371/500 [=====================>........] - ETA: 32s - loss: 1.0926 - regression_loss: 0.9677 - classification_loss: 0.1249 372/500 [=====================>........] - ETA: 32s - loss: 1.0938 - regression_loss: 0.9689 - classification_loss: 0.1250 373/500 [=====================>........] - ETA: 32s - loss: 1.0946 - regression_loss: 0.9696 - classification_loss: 0.1251 374/500 [=====================>........] - ETA: 31s - loss: 1.0960 - regression_loss: 0.9707 - classification_loss: 0.1253 375/500 [=====================>........] - ETA: 31s - loss: 1.0957 - regression_loss: 0.9705 - classification_loss: 0.1252 376/500 [=====================>........] - ETA: 31s - loss: 1.0958 - regression_loss: 0.9707 - classification_loss: 0.1251 377/500 [=====================>........] - ETA: 31s - loss: 1.0947 - regression_loss: 0.9698 - classification_loss: 0.1249 378/500 [=====================>........] - ETA: 30s - loss: 1.0945 - regression_loss: 0.9697 - classification_loss: 0.1248 379/500 [=====================>........] - ETA: 30s - loss: 1.0956 - regression_loss: 0.9707 - classification_loss: 0.1249 380/500 [=====================>........] - ETA: 30s - loss: 1.0955 - regression_loss: 0.9706 - classification_loss: 0.1249 381/500 [=====================>........] - ETA: 30s - loss: 1.0955 - regression_loss: 0.9706 - classification_loss: 0.1250 382/500 [=====================>........] 
- ETA: 29s - loss: 1.0961 - regression_loss: 0.9710 - classification_loss: 0.1251 383/500 [=====================>........] - ETA: 29s - loss: 1.0959 - regression_loss: 0.9707 - classification_loss: 0.1252 384/500 [======================>.......] - ETA: 29s - loss: 1.0980 - regression_loss: 0.9725 - classification_loss: 0.1254 385/500 [======================>.......] - ETA: 29s - loss: 1.0986 - regression_loss: 0.9730 - classification_loss: 0.1256 386/500 [======================>.......] - ETA: 28s - loss: 1.0981 - regression_loss: 0.9726 - classification_loss: 0.1255 387/500 [======================>.......] - ETA: 28s - loss: 1.0987 - regression_loss: 0.9732 - classification_loss: 0.1255 388/500 [======================>.......] - ETA: 28s - loss: 1.0988 - regression_loss: 0.9734 - classification_loss: 0.1254 389/500 [======================>.......] - ETA: 28s - loss: 1.0989 - regression_loss: 0.9736 - classification_loss: 0.1254 390/500 [======================>.......] - ETA: 27s - loss: 1.0993 - regression_loss: 0.9738 - classification_loss: 0.1255 391/500 [======================>.......] - ETA: 27s - loss: 1.0993 - regression_loss: 0.9738 - classification_loss: 0.1255 392/500 [======================>.......] - ETA: 27s - loss: 1.0993 - regression_loss: 0.9738 - classification_loss: 0.1255 393/500 [======================>.......] - ETA: 27s - loss: 1.0985 - regression_loss: 0.9731 - classification_loss: 0.1254 394/500 [======================>.......] - ETA: 26s - loss: 1.1000 - regression_loss: 0.9745 - classification_loss: 0.1255 395/500 [======================>.......] - ETA: 26s - loss: 1.0991 - regression_loss: 0.9737 - classification_loss: 0.1254 396/500 [======================>.......] - ETA: 26s - loss: 1.0990 - regression_loss: 0.9737 - classification_loss: 0.1254 397/500 [======================>.......] - ETA: 26s - loss: 1.0977 - regression_loss: 0.9725 - classification_loss: 0.1252 398/500 [======================>.......] 
- ETA: 25s - loss: 1.0981 - regression_loss: 0.9727 - classification_loss: 0.1254 399/500 [======================>.......] - ETA: 25s - loss: 1.0972 - regression_loss: 0.9718 - classification_loss: 0.1254 400/500 [=======================>......] - ETA: 25s - loss: 1.0960 - regression_loss: 0.9707 - classification_loss: 0.1252 401/500 [=======================>......] - ETA: 25s - loss: 1.0955 - regression_loss: 0.9704 - classification_loss: 0.1251 402/500 [=======================>......] - ETA: 24s - loss: 1.0941 - regression_loss: 0.9691 - classification_loss: 0.1249 403/500 [=======================>......] - ETA: 24s - loss: 1.0939 - regression_loss: 0.9689 - classification_loss: 0.1250 404/500 [=======================>......] - ETA: 24s - loss: 1.0940 - regression_loss: 0.9690 - classification_loss: 0.1250 405/500 [=======================>......] - ETA: 24s - loss: 1.0936 - regression_loss: 0.9688 - classification_loss: 0.1248 406/500 [=======================>......] - ETA: 23s - loss: 1.0945 - regression_loss: 0.9695 - classification_loss: 0.1249 407/500 [=======================>......] - ETA: 23s - loss: 1.0956 - regression_loss: 0.9705 - classification_loss: 0.1252 408/500 [=======================>......] - ETA: 23s - loss: 1.0961 - regression_loss: 0.9708 - classification_loss: 0.1253 409/500 [=======================>......] - ETA: 23s - loss: 1.0961 - regression_loss: 0.9709 - classification_loss: 0.1253 410/500 [=======================>......] - ETA: 22s - loss: 1.0944 - regression_loss: 0.9693 - classification_loss: 0.1251 411/500 [=======================>......] - ETA: 22s - loss: 1.0957 - regression_loss: 0.9703 - classification_loss: 0.1253 412/500 [=======================>......] - ETA: 22s - loss: 1.0959 - regression_loss: 0.9705 - classification_loss: 0.1254 413/500 [=======================>......] - ETA: 22s - loss: 1.0960 - regression_loss: 0.9707 - classification_loss: 0.1254 414/500 [=======================>......] 
- ETA: 21s - loss: 1.0975 - regression_loss: 0.9721 - classification_loss: 0.1255 415/500 [=======================>......] - ETA: 21s - loss: 1.0975 - regression_loss: 0.9720 - classification_loss: 0.1255 416/500 [=======================>......] - ETA: 21s - loss: 1.0983 - regression_loss: 0.9726 - classification_loss: 0.1257 417/500 [========================>.....] - ETA: 21s - loss: 1.0968 - regression_loss: 0.9713 - classification_loss: 0.1255 418/500 [========================>.....] - ETA: 20s - loss: 1.0971 - regression_loss: 0.9716 - classification_loss: 0.1255 419/500 [========================>.....] - ETA: 20s - loss: 1.0957 - regression_loss: 0.9704 - classification_loss: 0.1253 420/500 [========================>.....] - ETA: 20s - loss: 1.0938 - regression_loss: 0.9688 - classification_loss: 0.1250 421/500 [========================>.....] - ETA: 20s - loss: 1.0950 - regression_loss: 0.9698 - classification_loss: 0.1252 422/500 [========================>.....] - ETA: 19s - loss: 1.0956 - regression_loss: 0.9703 - classification_loss: 0.1253 423/500 [========================>.....] - ETA: 19s - loss: 1.0974 - regression_loss: 0.9718 - classification_loss: 0.1256 424/500 [========================>.....] - ETA: 19s - loss: 1.0979 - regression_loss: 0.9724 - classification_loss: 0.1256 425/500 [========================>.....] - ETA: 19s - loss: 1.0967 - regression_loss: 0.9712 - classification_loss: 0.1255 426/500 [========================>.....] - ETA: 18s - loss: 1.0979 - regression_loss: 0.9722 - classification_loss: 0.1257 427/500 [========================>.....] - ETA: 18s - loss: 1.0971 - regression_loss: 0.9716 - classification_loss: 0.1255 428/500 [========================>.....] - ETA: 18s - loss: 1.0976 - regression_loss: 0.9720 - classification_loss: 0.1256 429/500 [========================>.....] - ETA: 17s - loss: 1.0966 - regression_loss: 0.9711 - classification_loss: 0.1255 430/500 [========================>.....] 
- ETA: 17s - loss: 1.0974 - regression_loss: 0.9716 - classification_loss: 0.1257 431/500 [========================>.....] - ETA: 17s - loss: 1.0981 - regression_loss: 0.9722 - classification_loss: 0.1259 432/500 [========================>.....] - ETA: 17s - loss: 1.0978 - regression_loss: 0.9720 - classification_loss: 0.1258 433/500 [========================>.....] - ETA: 16s - loss: 1.0961 - regression_loss: 0.9705 - classification_loss: 0.1256 434/500 [=========================>....] - ETA: 16s - loss: 1.0967 - regression_loss: 0.9710 - classification_loss: 0.1257 435/500 [=========================>....] - ETA: 16s - loss: 1.0954 - regression_loss: 0.9700 - classification_loss: 0.1255 436/500 [=========================>....] - ETA: 16s - loss: 1.0954 - regression_loss: 0.9699 - classification_loss: 0.1254 437/500 [=========================>....] - ETA: 15s - loss: 1.0949 - regression_loss: 0.9695 - classification_loss: 0.1254 438/500 [=========================>....] - ETA: 15s - loss: 1.0954 - regression_loss: 0.9700 - classification_loss: 0.1255 439/500 [=========================>....] - ETA: 15s - loss: 1.0961 - regression_loss: 0.9705 - classification_loss: 0.1255 440/500 [=========================>....] - ETA: 15s - loss: 1.0955 - regression_loss: 0.9701 - classification_loss: 0.1255 441/500 [=========================>....] - ETA: 14s - loss: 1.0953 - regression_loss: 0.9697 - classification_loss: 0.1256 442/500 [=========================>....] - ETA: 14s - loss: 1.0952 - regression_loss: 0.9696 - classification_loss: 0.1256 443/500 [=========================>....] - ETA: 14s - loss: 1.0947 - regression_loss: 0.9692 - classification_loss: 0.1255 444/500 [=========================>....] - ETA: 14s - loss: 1.0951 - regression_loss: 0.9695 - classification_loss: 0.1256 445/500 [=========================>....] - ETA: 13s - loss: 1.0960 - regression_loss: 0.9703 - classification_loss: 0.1257 446/500 [=========================>....] 
[... per-batch progress-bar updates omitted ...]
500/500 [==============================] - 127s 254ms/step - loss: 1.0911 - regression_loss: 0.9662 - classification_loss: 0.1249
1172 instances of class plum with average precision: 0.7786
mAP: 0.7786
Epoch 00032: saving model to ./training/snapshots/resnet50_pascal_32.h5
Epoch 33/150
[... per-batch progress-bar updates omitted ...]
- ETA: 54s - loss: 1.0742 - regression_loss: 0.9515 - classification_loss: 0.1227 282/500 [===============>..............] - ETA: 54s - loss: 1.0757 - regression_loss: 0.9527 - classification_loss: 0.1230 283/500 [===============>..............] - ETA: 54s - loss: 1.0749 - regression_loss: 0.9520 - classification_loss: 0.1229 284/500 [================>.............] - ETA: 53s - loss: 1.0758 - regression_loss: 0.9528 - classification_loss: 0.1230 285/500 [================>.............] - ETA: 53s - loss: 1.0759 - regression_loss: 0.9528 - classification_loss: 0.1231 286/500 [================>.............] - ETA: 53s - loss: 1.0760 - regression_loss: 0.9529 - classification_loss: 0.1231 287/500 [================>.............] - ETA: 53s - loss: 1.0771 - regression_loss: 0.9538 - classification_loss: 0.1233 288/500 [================>.............] - ETA: 52s - loss: 1.0772 - regression_loss: 0.9538 - classification_loss: 0.1234 289/500 [================>.............] - ETA: 52s - loss: 1.0783 - regression_loss: 0.9547 - classification_loss: 0.1236 290/500 [================>.............] - ETA: 52s - loss: 1.0779 - regression_loss: 0.9543 - classification_loss: 0.1237 291/500 [================>.............] - ETA: 52s - loss: 1.0782 - regression_loss: 0.9545 - classification_loss: 0.1237 292/500 [================>.............] - ETA: 51s - loss: 1.0785 - regression_loss: 0.9545 - classification_loss: 0.1240 293/500 [================>.............] - ETA: 51s - loss: 1.0775 - regression_loss: 0.9536 - classification_loss: 0.1238 294/500 [================>.............] - ETA: 51s - loss: 1.0746 - regression_loss: 0.9511 - classification_loss: 0.1235 295/500 [================>.............] - ETA: 51s - loss: 1.0735 - regression_loss: 0.9501 - classification_loss: 0.1234 296/500 [================>.............] - ETA: 50s - loss: 1.0715 - regression_loss: 0.9483 - classification_loss: 0.1232 297/500 [================>.............] 
- ETA: 50s - loss: 1.0692 - regression_loss: 0.9463 - classification_loss: 0.1229 298/500 [================>.............] - ETA: 50s - loss: 1.0687 - regression_loss: 0.9458 - classification_loss: 0.1229 299/500 [================>.............] - ETA: 50s - loss: 1.0684 - regression_loss: 0.9454 - classification_loss: 0.1229 300/500 [=================>............] - ETA: 49s - loss: 1.0677 - regression_loss: 0.9450 - classification_loss: 0.1228 301/500 [=================>............] - ETA: 49s - loss: 1.0683 - regression_loss: 0.9454 - classification_loss: 0.1228 302/500 [=================>............] - ETA: 49s - loss: 1.0666 - regression_loss: 0.9441 - classification_loss: 0.1226 303/500 [=================>............] - ETA: 49s - loss: 1.0673 - regression_loss: 0.9446 - classification_loss: 0.1227 304/500 [=================>............] - ETA: 48s - loss: 1.0680 - regression_loss: 0.9452 - classification_loss: 0.1228 305/500 [=================>............] - ETA: 48s - loss: 1.0658 - regression_loss: 0.9432 - classification_loss: 0.1226 306/500 [=================>............] - ETA: 48s - loss: 1.0638 - regression_loss: 0.9414 - classification_loss: 0.1224 307/500 [=================>............] - ETA: 48s - loss: 1.0638 - regression_loss: 0.9414 - classification_loss: 0.1224 308/500 [=================>............] - ETA: 47s - loss: 1.0629 - regression_loss: 0.9407 - classification_loss: 0.1223 309/500 [=================>............] - ETA: 47s - loss: 1.0632 - regression_loss: 0.9410 - classification_loss: 0.1222 310/500 [=================>............] - ETA: 47s - loss: 1.0640 - regression_loss: 0.9417 - classification_loss: 0.1223 311/500 [=================>............] - ETA: 47s - loss: 1.0651 - regression_loss: 0.9426 - classification_loss: 0.1225 312/500 [=================>............] - ETA: 46s - loss: 1.0658 - regression_loss: 0.9432 - classification_loss: 0.1226 313/500 [=================>............] 
- ETA: 46s - loss: 1.0676 - regression_loss: 0.9448 - classification_loss: 0.1228 314/500 [=================>............] - ETA: 46s - loss: 1.0675 - regression_loss: 0.9447 - classification_loss: 0.1228 315/500 [=================>............] - ETA: 46s - loss: 1.0682 - regression_loss: 0.9454 - classification_loss: 0.1229 316/500 [=================>............] - ETA: 45s - loss: 1.0694 - regression_loss: 0.9465 - classification_loss: 0.1229 317/500 [==================>...........] - ETA: 45s - loss: 1.0704 - regression_loss: 0.9474 - classification_loss: 0.1231 318/500 [==================>...........] - ETA: 45s - loss: 1.0693 - regression_loss: 0.9465 - classification_loss: 0.1228 319/500 [==================>...........] - ETA: 45s - loss: 1.0688 - regression_loss: 0.9461 - classification_loss: 0.1227 320/500 [==================>...........] - ETA: 44s - loss: 1.0697 - regression_loss: 0.9470 - classification_loss: 0.1227 321/500 [==================>...........] - ETA: 44s - loss: 1.0686 - regression_loss: 0.9461 - classification_loss: 0.1225 322/500 [==================>...........] - ETA: 44s - loss: 1.0674 - regression_loss: 0.9452 - classification_loss: 0.1223 323/500 [==================>...........] - ETA: 44s - loss: 1.0676 - regression_loss: 0.9452 - classification_loss: 0.1223 324/500 [==================>...........] - ETA: 43s - loss: 1.0677 - regression_loss: 0.9455 - classification_loss: 0.1223 325/500 [==================>...........] - ETA: 43s - loss: 1.0690 - regression_loss: 0.9466 - classification_loss: 0.1224 326/500 [==================>...........] - ETA: 43s - loss: 1.0671 - regression_loss: 0.9450 - classification_loss: 0.1221 327/500 [==================>...........] - ETA: 43s - loss: 1.0679 - regression_loss: 0.9455 - classification_loss: 0.1223 328/500 [==================>...........] - ETA: 42s - loss: 1.0672 - regression_loss: 0.9450 - classification_loss: 0.1222 329/500 [==================>...........] 
- ETA: 42s - loss: 1.0680 - regression_loss: 0.9457 - classification_loss: 0.1223 330/500 [==================>...........] - ETA: 42s - loss: 1.0697 - regression_loss: 0.9474 - classification_loss: 0.1223 331/500 [==================>...........] - ETA: 42s - loss: 1.0703 - regression_loss: 0.9478 - classification_loss: 0.1224 332/500 [==================>...........] - ETA: 41s - loss: 1.0702 - regression_loss: 0.9477 - classification_loss: 0.1225 333/500 [==================>...........] - ETA: 41s - loss: 1.0709 - regression_loss: 0.9484 - classification_loss: 0.1225 334/500 [===================>..........] - ETA: 41s - loss: 1.0694 - regression_loss: 0.9470 - classification_loss: 0.1224 335/500 [===================>..........] - ETA: 41s - loss: 1.0689 - regression_loss: 0.9466 - classification_loss: 0.1223 336/500 [===================>..........] - ETA: 40s - loss: 1.0682 - regression_loss: 0.9461 - classification_loss: 0.1221 337/500 [===================>..........] - ETA: 40s - loss: 1.0689 - regression_loss: 0.9467 - classification_loss: 0.1222 338/500 [===================>..........] - ETA: 40s - loss: 1.0683 - regression_loss: 0.9461 - classification_loss: 0.1222 339/500 [===================>..........] - ETA: 40s - loss: 1.0681 - regression_loss: 0.9459 - classification_loss: 0.1221 340/500 [===================>..........] - ETA: 39s - loss: 1.0684 - regression_loss: 0.9463 - classification_loss: 0.1221 341/500 [===================>..........] - ETA: 39s - loss: 1.0693 - regression_loss: 0.9470 - classification_loss: 0.1223 342/500 [===================>..........] - ETA: 39s - loss: 1.0703 - regression_loss: 0.9477 - classification_loss: 0.1226 343/500 [===================>..........] - ETA: 39s - loss: 1.0702 - regression_loss: 0.9478 - classification_loss: 0.1224 344/500 [===================>..........] - ETA: 38s - loss: 1.0698 - regression_loss: 0.9474 - classification_loss: 0.1224 345/500 [===================>..........] 
- ETA: 38s - loss: 1.0696 - regression_loss: 0.9472 - classification_loss: 0.1225 346/500 [===================>..........] - ETA: 38s - loss: 1.0692 - regression_loss: 0.9468 - classification_loss: 0.1224 347/500 [===================>..........] - ETA: 38s - loss: 1.0677 - regression_loss: 0.9455 - classification_loss: 0.1222 348/500 [===================>..........] - ETA: 37s - loss: 1.0676 - regression_loss: 0.9454 - classification_loss: 0.1222 349/500 [===================>..........] - ETA: 37s - loss: 1.0686 - regression_loss: 0.9463 - classification_loss: 0.1223 350/500 [====================>.........] - ETA: 37s - loss: 1.0696 - regression_loss: 0.9473 - classification_loss: 0.1223 351/500 [====================>.........] - ETA: 37s - loss: 1.0702 - regression_loss: 0.9478 - classification_loss: 0.1224 352/500 [====================>.........] - ETA: 36s - loss: 1.0705 - regression_loss: 0.9481 - classification_loss: 0.1224 353/500 [====================>.........] - ETA: 36s - loss: 1.0701 - regression_loss: 0.9477 - classification_loss: 0.1224 354/500 [====================>.........] - ETA: 36s - loss: 1.0697 - regression_loss: 0.9474 - classification_loss: 0.1224 355/500 [====================>.........] - ETA: 36s - loss: 1.0713 - regression_loss: 0.9489 - classification_loss: 0.1224 356/500 [====================>.........] - ETA: 35s - loss: 1.0715 - regression_loss: 0.9491 - classification_loss: 0.1224 357/500 [====================>.........] - ETA: 35s - loss: 1.0698 - regression_loss: 0.9476 - classification_loss: 0.1222 358/500 [====================>.........] - ETA: 35s - loss: 1.0699 - regression_loss: 0.9476 - classification_loss: 0.1222 359/500 [====================>.........] - ETA: 35s - loss: 1.0711 - regression_loss: 0.9487 - classification_loss: 0.1224 360/500 [====================>.........] - ETA: 35s - loss: 1.0693 - regression_loss: 0.9472 - classification_loss: 0.1221 361/500 [====================>.........] 
- ETA: 34s - loss: 1.0687 - regression_loss: 0.9467 - classification_loss: 0.1220 362/500 [====================>.........] - ETA: 34s - loss: 1.0688 - regression_loss: 0.9467 - classification_loss: 0.1221 363/500 [====================>.........] - ETA: 34s - loss: 1.0703 - regression_loss: 0.9479 - classification_loss: 0.1224 364/500 [====================>.........] - ETA: 33s - loss: 1.0690 - regression_loss: 0.9468 - classification_loss: 0.1222 365/500 [====================>.........] - ETA: 33s - loss: 1.0705 - regression_loss: 0.9480 - classification_loss: 0.1225 366/500 [====================>.........] - ETA: 33s - loss: 1.0711 - regression_loss: 0.9484 - classification_loss: 0.1226 367/500 [=====================>........] - ETA: 33s - loss: 1.0710 - regression_loss: 0.9483 - classification_loss: 0.1227 368/500 [=====================>........] - ETA: 32s - loss: 1.0688 - regression_loss: 0.9464 - classification_loss: 0.1224 369/500 [=====================>........] - ETA: 32s - loss: 1.0676 - regression_loss: 0.9452 - classification_loss: 0.1223 370/500 [=====================>........] - ETA: 32s - loss: 1.0673 - regression_loss: 0.9450 - classification_loss: 0.1223 371/500 [=====================>........] - ETA: 32s - loss: 1.0677 - regression_loss: 0.9453 - classification_loss: 0.1224 372/500 [=====================>........] - ETA: 31s - loss: 1.0676 - regression_loss: 0.9453 - classification_loss: 0.1222 373/500 [=====================>........] - ETA: 31s - loss: 1.0658 - regression_loss: 0.9438 - classification_loss: 0.1220 374/500 [=====================>........] - ETA: 31s - loss: 1.0656 - regression_loss: 0.9437 - classification_loss: 0.1219 375/500 [=====================>........] - ETA: 31s - loss: 1.0650 - regression_loss: 0.9431 - classification_loss: 0.1219 376/500 [=====================>........] - ETA: 30s - loss: 1.0644 - regression_loss: 0.9427 - classification_loss: 0.1216 377/500 [=====================>........] 
- ETA: 30s - loss: 1.0652 - regression_loss: 0.9435 - classification_loss: 0.1217 378/500 [=====================>........] - ETA: 30s - loss: 1.0669 - regression_loss: 0.9450 - classification_loss: 0.1219 379/500 [=====================>........] - ETA: 30s - loss: 1.0679 - regression_loss: 0.9458 - classification_loss: 0.1221 380/500 [=====================>........] - ETA: 29s - loss: 1.0692 - regression_loss: 0.9470 - classification_loss: 0.1222 381/500 [=====================>........] - ETA: 29s - loss: 1.0699 - regression_loss: 0.9475 - classification_loss: 0.1224 382/500 [=====================>........] - ETA: 29s - loss: 1.0695 - regression_loss: 0.9472 - classification_loss: 0.1223 383/500 [=====================>........] - ETA: 29s - loss: 1.0709 - regression_loss: 0.9484 - classification_loss: 0.1225 384/500 [======================>.......] - ETA: 28s - loss: 1.0709 - regression_loss: 0.9482 - classification_loss: 0.1227 385/500 [======================>.......] - ETA: 28s - loss: 1.0722 - regression_loss: 0.9493 - classification_loss: 0.1229 386/500 [======================>.......] - ETA: 28s - loss: 1.0708 - regression_loss: 0.9481 - classification_loss: 0.1227 387/500 [======================>.......] - ETA: 28s - loss: 1.0717 - regression_loss: 0.9489 - classification_loss: 0.1228 388/500 [======================>.......] - ETA: 27s - loss: 1.0713 - regression_loss: 0.9486 - classification_loss: 0.1227 389/500 [======================>.......] - ETA: 27s - loss: 1.0703 - regression_loss: 0.9478 - classification_loss: 0.1225 390/500 [======================>.......] - ETA: 27s - loss: 1.0710 - regression_loss: 0.9483 - classification_loss: 0.1227 391/500 [======================>.......] - ETA: 27s - loss: 1.0724 - regression_loss: 0.9495 - classification_loss: 0.1229 392/500 [======================>.......] - ETA: 26s - loss: 1.0729 - regression_loss: 0.9499 - classification_loss: 0.1230 393/500 [======================>.......] 
- ETA: 26s - loss: 1.0718 - regression_loss: 0.9488 - classification_loss: 0.1230 394/500 [======================>.......] - ETA: 26s - loss: 1.0719 - regression_loss: 0.9490 - classification_loss: 0.1229 395/500 [======================>.......] - ETA: 26s - loss: 1.0725 - regression_loss: 0.9495 - classification_loss: 0.1230 396/500 [======================>.......] - ETA: 25s - loss: 1.0725 - regression_loss: 0.9496 - classification_loss: 0.1229 397/500 [======================>.......] - ETA: 25s - loss: 1.0729 - regression_loss: 0.9501 - classification_loss: 0.1228 398/500 [======================>.......] - ETA: 25s - loss: 1.0739 - regression_loss: 0.9508 - classification_loss: 0.1231 399/500 [======================>.......] - ETA: 25s - loss: 1.0734 - regression_loss: 0.9503 - classification_loss: 0.1230 400/500 [=======================>......] - ETA: 24s - loss: 1.0791 - regression_loss: 0.9521 - classification_loss: 0.1270 401/500 [=======================>......] - ETA: 24s - loss: 1.0789 - regression_loss: 0.9521 - classification_loss: 0.1269 402/500 [=======================>......] - ETA: 24s - loss: 1.0784 - regression_loss: 0.9517 - classification_loss: 0.1267 403/500 [=======================>......] - ETA: 24s - loss: 1.0764 - regression_loss: 0.9500 - classification_loss: 0.1264 404/500 [=======================>......] - ETA: 23s - loss: 1.0766 - regression_loss: 0.9500 - classification_loss: 0.1266 405/500 [=======================>......] - ETA: 23s - loss: 1.0767 - regression_loss: 0.9500 - classification_loss: 0.1266 406/500 [=======================>......] - ETA: 23s - loss: 1.0770 - regression_loss: 0.9505 - classification_loss: 0.1265 407/500 [=======================>......] - ETA: 23s - loss: 1.0768 - regression_loss: 0.9503 - classification_loss: 0.1265 408/500 [=======================>......] - ETA: 22s - loss: 1.0771 - regression_loss: 0.9505 - classification_loss: 0.1266 409/500 [=======================>......] 
- ETA: 22s - loss: 1.0758 - regression_loss: 0.9494 - classification_loss: 0.1264 410/500 [=======================>......] - ETA: 22s - loss: 1.0761 - regression_loss: 0.9497 - classification_loss: 0.1263 411/500 [=======================>......] - ETA: 22s - loss: 1.0748 - regression_loss: 0.9487 - classification_loss: 0.1261 412/500 [=======================>......] - ETA: 21s - loss: 1.0752 - regression_loss: 0.9490 - classification_loss: 0.1262 413/500 [=======================>......] - ETA: 21s - loss: 1.0747 - regression_loss: 0.9486 - classification_loss: 0.1261 414/500 [=======================>......] - ETA: 21s - loss: 1.0748 - regression_loss: 0.9487 - classification_loss: 0.1261 415/500 [=======================>......] - ETA: 21s - loss: 1.0757 - regression_loss: 0.9495 - classification_loss: 0.1262 416/500 [=======================>......] - ETA: 20s - loss: 1.0755 - regression_loss: 0.9494 - classification_loss: 0.1261 417/500 [========================>.....] - ETA: 20s - loss: 1.0756 - regression_loss: 0.9494 - classification_loss: 0.1261 418/500 [========================>.....] - ETA: 20s - loss: 1.0759 - regression_loss: 0.9497 - classification_loss: 0.1262 419/500 [========================>.....] - ETA: 20s - loss: 1.0772 - regression_loss: 0.9510 - classification_loss: 0.1262 420/500 [========================>.....] - ETA: 19s - loss: 1.0760 - regression_loss: 0.9499 - classification_loss: 0.1260 421/500 [========================>.....] - ETA: 19s - loss: 1.0757 - regression_loss: 0.9497 - classification_loss: 0.1261 422/500 [========================>.....] - ETA: 19s - loss: 1.0772 - regression_loss: 0.9509 - classification_loss: 0.1263 423/500 [========================>.....] - ETA: 19s - loss: 1.0772 - regression_loss: 0.9510 - classification_loss: 0.1262 424/500 [========================>.....] - ETA: 18s - loss: 1.0780 - regression_loss: 0.9517 - classification_loss: 0.1263 425/500 [========================>.....] 
- ETA: 18s - loss: 1.0784 - regression_loss: 0.9521 - classification_loss: 0.1263 426/500 [========================>.....] - ETA: 18s - loss: 1.0785 - regression_loss: 0.9522 - classification_loss: 0.1263 427/500 [========================>.....] - ETA: 18s - loss: 1.0792 - regression_loss: 0.9529 - classification_loss: 0.1263 428/500 [========================>.....] - ETA: 17s - loss: 1.0804 - regression_loss: 0.9521 - classification_loss: 0.1284 429/500 [========================>.....] - ETA: 17s - loss: 1.0808 - regression_loss: 0.9524 - classification_loss: 0.1285 430/500 [========================>.....] - ETA: 17s - loss: 1.0817 - regression_loss: 0.9531 - classification_loss: 0.1286 431/500 [========================>.....] - ETA: 17s - loss: 1.0815 - regression_loss: 0.9531 - classification_loss: 0.1285 432/500 [========================>.....] - ETA: 16s - loss: 1.0817 - regression_loss: 0.9532 - classification_loss: 0.1285 433/500 [========================>.....] - ETA: 16s - loss: 1.0820 - regression_loss: 0.9533 - classification_loss: 0.1286 434/500 [=========================>....] - ETA: 16s - loss: 1.0804 - regression_loss: 0.9519 - classification_loss: 0.1285 435/500 [=========================>....] - ETA: 16s - loss: 1.0802 - regression_loss: 0.9517 - classification_loss: 0.1285 436/500 [=========================>....] - ETA: 15s - loss: 1.0798 - regression_loss: 0.9513 - classification_loss: 0.1285 437/500 [=========================>....] - ETA: 15s - loss: 1.0811 - regression_loss: 0.9524 - classification_loss: 0.1288 438/500 [=========================>....] - ETA: 15s - loss: 1.0817 - regression_loss: 0.9528 - classification_loss: 0.1289 439/500 [=========================>....] - ETA: 15s - loss: 1.0828 - regression_loss: 0.9539 - classification_loss: 0.1289 440/500 [=========================>....] - ETA: 14s - loss: 1.0839 - regression_loss: 0.9548 - classification_loss: 0.1291 441/500 [=========================>....] 
- ETA: 14s - loss: 1.0846 - regression_loss: 0.9554 - classification_loss: 0.1292 442/500 [=========================>....] - ETA: 14s - loss: 1.0832 - regression_loss: 0.9541 - classification_loss: 0.1290 443/500 [=========================>....] - ETA: 14s - loss: 1.0820 - regression_loss: 0.9532 - classification_loss: 0.1288 444/500 [=========================>....] - ETA: 13s - loss: 1.0828 - regression_loss: 0.9539 - classification_loss: 0.1289 445/500 [=========================>....] - ETA: 13s - loss: 1.0818 - regression_loss: 0.9530 - classification_loss: 0.1288 446/500 [=========================>....] - ETA: 13s - loss: 1.0829 - regression_loss: 0.9539 - classification_loss: 0.1290 447/500 [=========================>....] - ETA: 13s - loss: 1.0833 - regression_loss: 0.9544 - classification_loss: 0.1289 448/500 [=========================>....] - ETA: 12s - loss: 1.0842 - regression_loss: 0.9552 - classification_loss: 0.1290 449/500 [=========================>....] - ETA: 12s - loss: 1.0838 - regression_loss: 0.9549 - classification_loss: 0.1289 450/500 [==========================>...] - ETA: 12s - loss: 1.0831 - regression_loss: 0.9543 - classification_loss: 0.1288 451/500 [==========================>...] - ETA: 12s - loss: 1.0827 - regression_loss: 0.9541 - classification_loss: 0.1286 452/500 [==========================>...] - ETA: 11s - loss: 1.0826 - regression_loss: 0.9540 - classification_loss: 0.1286 453/500 [==========================>...] - ETA: 11s - loss: 1.0826 - regression_loss: 0.9540 - classification_loss: 0.1286 454/500 [==========================>...] - ETA: 11s - loss: 1.0832 - regression_loss: 0.9545 - classification_loss: 0.1287 455/500 [==========================>...] - ETA: 11s - loss: 1.0825 - regression_loss: 0.9539 - classification_loss: 0.1286 456/500 [==========================>...] - ETA: 10s - loss: 1.0827 - regression_loss: 0.9540 - classification_loss: 0.1286 457/500 [==========================>...] 
- ETA: 10s - loss: 1.0830 - regression_loss: 0.9542 - classification_loss: 0.1287 458/500 [==========================>...] - ETA: 10s - loss: 1.0849 - regression_loss: 0.9558 - classification_loss: 0.1290 459/500 [==========================>...] - ETA: 10s - loss: 1.0842 - regression_loss: 0.9552 - classification_loss: 0.1290 460/500 [==========================>...] - ETA: 9s - loss: 1.0841 - regression_loss: 0.9552 - classification_loss: 0.1289  461/500 [==========================>...] - ETA: 9s - loss: 1.0842 - regression_loss: 0.9553 - classification_loss: 0.1289 462/500 [==========================>...] - ETA: 9s - loss: 1.0841 - regression_loss: 0.9551 - classification_loss: 0.1289 463/500 [==========================>...] - ETA: 9s - loss: 1.0841 - regression_loss: 0.9551 - classification_loss: 0.1289 464/500 [==========================>...] - ETA: 8s - loss: 1.0841 - regression_loss: 0.9552 - classification_loss: 0.1289 465/500 [==========================>...] - ETA: 8s - loss: 1.0829 - regression_loss: 0.9542 - classification_loss: 0.1287 466/500 [==========================>...] - ETA: 8s - loss: 1.0834 - regression_loss: 0.9547 - classification_loss: 0.1287 467/500 [===========================>..] - ETA: 8s - loss: 1.0815 - regression_loss: 0.9530 - classification_loss: 0.1285 468/500 [===========================>..] - ETA: 7s - loss: 1.0818 - regression_loss: 0.9533 - classification_loss: 0.1285 469/500 [===========================>..] - ETA: 7s - loss: 1.0813 - regression_loss: 0.9529 - classification_loss: 0.1285 470/500 [===========================>..] - ETA: 7s - loss: 1.0800 - regression_loss: 0.9518 - classification_loss: 0.1283 471/500 [===========================>..] - ETA: 7s - loss: 1.0798 - regression_loss: 0.9516 - classification_loss: 0.1282 472/500 [===========================>..] - ETA: 6s - loss: 1.0796 - regression_loss: 0.9513 - classification_loss: 0.1283 473/500 [===========================>..] 
- ETA: 6s - loss: 1.0799 - regression_loss: 0.9515 - classification_loss: 0.1283 474/500 [===========================>..] - ETA: 6s - loss: 1.0810 - regression_loss: 0.9524 - classification_loss: 0.1286 475/500 [===========================>..] - ETA: 6s - loss: 1.0807 - regression_loss: 0.9521 - classification_loss: 0.1286 476/500 [===========================>..] - ETA: 5s - loss: 1.0803 - regression_loss: 0.9518 - classification_loss: 0.1285 477/500 [===========================>..] - ETA: 5s - loss: 1.0797 - regression_loss: 0.9512 - classification_loss: 0.1285 478/500 [===========================>..] - ETA: 5s - loss: 1.0788 - regression_loss: 0.9504 - classification_loss: 0.1284 479/500 [===========================>..] - ETA: 5s - loss: 1.0794 - regression_loss: 0.9509 - classification_loss: 0.1285 480/500 [===========================>..] - ETA: 4s - loss: 1.0800 - regression_loss: 0.9515 - classification_loss: 0.1285 481/500 [===========================>..] - ETA: 4s - loss: 1.0789 - regression_loss: 0.9505 - classification_loss: 0.1284 482/500 [===========================>..] - ETA: 4s - loss: 1.0788 - regression_loss: 0.9504 - classification_loss: 0.1284 483/500 [===========================>..] - ETA: 4s - loss: 1.0797 - regression_loss: 0.9511 - classification_loss: 0.1286 484/500 [============================>.] - ETA: 3s - loss: 1.0799 - regression_loss: 0.9513 - classification_loss: 0.1286 485/500 [============================>.] - ETA: 3s - loss: 1.0797 - regression_loss: 0.9511 - classification_loss: 0.1286 486/500 [============================>.] - ETA: 3s - loss: 1.0799 - regression_loss: 0.9512 - classification_loss: 0.1286 487/500 [============================>.] - ETA: 3s - loss: 1.0812 - regression_loss: 0.9524 - classification_loss: 0.1288 488/500 [============================>.] - ETA: 2s - loss: 1.0806 - regression_loss: 0.9520 - classification_loss: 0.1287 489/500 [============================>.] 
- ETA: 2s - loss: 1.0799 - regression_loss: 0.9514 - classification_loss: 0.1285
500/500 [==============================] - 125s 249ms/step - loss: 1.0732 - regression_loss: 0.9458 - classification_loss: 0.1274
1172 instances of class plum with average precision: 0.7406
mAP: 0.7406
Epoch 00033: saving model to ./training/snapshots/resnet50_pascal_33.h5
Epoch 34/150
  1/500 [..............................] - ETA: 1:48 - loss: 1.1724 - regression_loss: 1.0494 - classification_loss: 0.1230
 50/500 [==>...........................] - ETA: 1:51 - loss: 1.0187 - regression_loss: 0.9060 - classification_loss: 0.1127
100/500 [=====>........................] - ETA: 1:39 - loss: 1.0745 - regression_loss: 0.9494 - classification_loss: 0.1251
150/500 [========>.....................] - ETA: 1:27 - loss: 1.0429 - regression_loss: 0.9226 - classification_loss: 0.1203
200/500 [===========>..................] - ETA: 1:15 - loss: 1.0405 - regression_loss: 0.9219 - classification_loss: 0.1187
250/500 [==============>...............] - ETA: 1:02 - loss: 1.0373 - regression_loss: 0.9199 - classification_loss: 0.1174
300/500 [=================>............] - ETA: 50s - loss: 1.0308 - regression_loss: 0.9144 - classification_loss: 0.1163
324/500 [==================>...........]
- ETA: 44s - loss: 1.0240 - regression_loss: 0.9086 - classification_loss: 0.1154 325/500 [==================>...........] - ETA: 43s - loss: 1.0231 - regression_loss: 0.9079 - classification_loss: 0.1152 326/500 [==================>...........] - ETA: 43s - loss: 1.0249 - regression_loss: 0.9094 - classification_loss: 0.1155 327/500 [==================>...........] - ETA: 43s - loss: 1.0263 - regression_loss: 0.9106 - classification_loss: 0.1157 328/500 [==================>...........] - ETA: 43s - loss: 1.0294 - regression_loss: 0.9135 - classification_loss: 0.1159 329/500 [==================>...........] - ETA: 42s - loss: 1.0287 - regression_loss: 0.9128 - classification_loss: 0.1158 330/500 [==================>...........] - ETA: 42s - loss: 1.0290 - regression_loss: 0.9132 - classification_loss: 0.1159 331/500 [==================>...........] - ETA: 42s - loss: 1.0285 - regression_loss: 0.9127 - classification_loss: 0.1157 332/500 [==================>...........] - ETA: 42s - loss: 1.0279 - regression_loss: 0.9123 - classification_loss: 0.1156 333/500 [==================>...........] - ETA: 41s - loss: 1.0263 - regression_loss: 0.9109 - classification_loss: 0.1154 334/500 [===================>..........] - ETA: 41s - loss: 1.0266 - regression_loss: 0.9112 - classification_loss: 0.1154 335/500 [===================>..........] - ETA: 41s - loss: 1.0263 - regression_loss: 0.9109 - classification_loss: 0.1154 336/500 [===================>..........] - ETA: 41s - loss: 1.0247 - regression_loss: 0.9095 - classification_loss: 0.1152 337/500 [===================>..........] - ETA: 40s - loss: 1.0240 - regression_loss: 0.9089 - classification_loss: 0.1151 338/500 [===================>..........] - ETA: 40s - loss: 1.0240 - regression_loss: 0.9089 - classification_loss: 0.1151 339/500 [===================>..........] - ETA: 40s - loss: 1.0229 - regression_loss: 0.9080 - classification_loss: 0.1149 340/500 [===================>..........] 
- ETA: 40s - loss: 1.0233 - regression_loss: 0.9084 - classification_loss: 0.1149 341/500 [===================>..........] - ETA: 39s - loss: 1.0221 - regression_loss: 0.9075 - classification_loss: 0.1146 342/500 [===================>..........] - ETA: 39s - loss: 1.0243 - regression_loss: 0.9093 - classification_loss: 0.1151 343/500 [===================>..........] - ETA: 39s - loss: 1.0226 - regression_loss: 0.9077 - classification_loss: 0.1149 344/500 [===================>..........] - ETA: 39s - loss: 1.0233 - regression_loss: 0.9084 - classification_loss: 0.1149 345/500 [===================>..........] - ETA: 38s - loss: 1.0219 - regression_loss: 0.9073 - classification_loss: 0.1147 346/500 [===================>..........] - ETA: 38s - loss: 1.0228 - regression_loss: 0.9080 - classification_loss: 0.1148 347/500 [===================>..........] - ETA: 38s - loss: 1.0216 - regression_loss: 0.9070 - classification_loss: 0.1146 348/500 [===================>..........] - ETA: 38s - loss: 1.0220 - regression_loss: 0.9074 - classification_loss: 0.1146 349/500 [===================>..........] - ETA: 37s - loss: 1.0228 - regression_loss: 0.9081 - classification_loss: 0.1147 350/500 [====================>.........] - ETA: 37s - loss: 1.0242 - regression_loss: 0.9093 - classification_loss: 0.1149 351/500 [====================>.........] - ETA: 37s - loss: 1.0234 - regression_loss: 0.9085 - classification_loss: 0.1149 352/500 [====================>.........] - ETA: 37s - loss: 1.0242 - regression_loss: 0.9094 - classification_loss: 0.1148 353/500 [====================>.........] - ETA: 36s - loss: 1.0244 - regression_loss: 0.9098 - classification_loss: 0.1146 354/500 [====================>.........] - ETA: 36s - loss: 1.0253 - regression_loss: 0.9107 - classification_loss: 0.1147 355/500 [====================>.........] - ETA: 36s - loss: 1.0249 - regression_loss: 0.9103 - classification_loss: 0.1146 356/500 [====================>.........] 
- ETA: 36s - loss: 1.0256 - regression_loss: 0.9111 - classification_loss: 0.1145 357/500 [====================>.........] - ETA: 35s - loss: 1.0267 - regression_loss: 0.9121 - classification_loss: 0.1146 358/500 [====================>.........] - ETA: 35s - loss: 1.0270 - regression_loss: 0.9123 - classification_loss: 0.1147 359/500 [====================>.........] - ETA: 35s - loss: 1.0280 - regression_loss: 0.9132 - classification_loss: 0.1148 360/500 [====================>.........] - ETA: 35s - loss: 1.0288 - regression_loss: 0.9137 - classification_loss: 0.1150 361/500 [====================>.........] - ETA: 34s - loss: 1.0274 - regression_loss: 0.9125 - classification_loss: 0.1149 362/500 [====================>.........] - ETA: 34s - loss: 1.0289 - regression_loss: 0.9138 - classification_loss: 0.1152 363/500 [====================>.........] - ETA: 34s - loss: 1.0306 - regression_loss: 0.9152 - classification_loss: 0.1154 364/500 [====================>.........] - ETA: 34s - loss: 1.0307 - regression_loss: 0.9153 - classification_loss: 0.1154 365/500 [====================>.........] - ETA: 33s - loss: 1.0289 - regression_loss: 0.9137 - classification_loss: 0.1152 366/500 [====================>.........] - ETA: 33s - loss: 1.0281 - regression_loss: 0.9129 - classification_loss: 0.1151 367/500 [=====================>........] - ETA: 33s - loss: 1.0292 - regression_loss: 0.9139 - classification_loss: 0.1153 368/500 [=====================>........] - ETA: 33s - loss: 1.0297 - regression_loss: 0.9143 - classification_loss: 0.1154 369/500 [=====================>........] - ETA: 32s - loss: 1.0285 - regression_loss: 0.9132 - classification_loss: 0.1152 370/500 [=====================>........] - ETA: 32s - loss: 1.0287 - regression_loss: 0.9136 - classification_loss: 0.1152 371/500 [=====================>........] - ETA: 32s - loss: 1.0293 - regression_loss: 0.9141 - classification_loss: 0.1152 372/500 [=====================>........] 
- ETA: 32s - loss: 1.0301 - regression_loss: 0.9148 - classification_loss: 0.1153 373/500 [=====================>........] - ETA: 31s - loss: 1.0315 - regression_loss: 0.9160 - classification_loss: 0.1155 374/500 [=====================>........] - ETA: 31s - loss: 1.0314 - regression_loss: 0.9159 - classification_loss: 0.1155 375/500 [=====================>........] - ETA: 31s - loss: 1.0300 - regression_loss: 0.9147 - classification_loss: 0.1153 376/500 [=====================>........] - ETA: 31s - loss: 1.0309 - regression_loss: 0.9155 - classification_loss: 0.1155 377/500 [=====================>........] - ETA: 30s - loss: 1.0320 - regression_loss: 0.9164 - classification_loss: 0.1157 378/500 [=====================>........] - ETA: 30s - loss: 1.0318 - regression_loss: 0.9163 - classification_loss: 0.1156 379/500 [=====================>........] - ETA: 30s - loss: 1.0332 - regression_loss: 0.9173 - classification_loss: 0.1158 380/500 [=====================>........] - ETA: 30s - loss: 1.0334 - regression_loss: 0.9176 - classification_loss: 0.1158 381/500 [=====================>........] - ETA: 29s - loss: 1.0343 - regression_loss: 0.9184 - classification_loss: 0.1159 382/500 [=====================>........] - ETA: 29s - loss: 1.0329 - regression_loss: 0.9172 - classification_loss: 0.1157 383/500 [=====================>........] - ETA: 29s - loss: 1.0332 - regression_loss: 0.9173 - classification_loss: 0.1159 384/500 [======================>.......] - ETA: 29s - loss: 1.0331 - regression_loss: 0.9172 - classification_loss: 0.1159 385/500 [======================>.......] - ETA: 28s - loss: 1.0324 - regression_loss: 0.9166 - classification_loss: 0.1158 386/500 [======================>.......] - ETA: 28s - loss: 1.0311 - regression_loss: 0.9154 - classification_loss: 0.1156 387/500 [======================>.......] - ETA: 28s - loss: 1.0310 - regression_loss: 0.9155 - classification_loss: 0.1155 388/500 [======================>.......] 
- ETA: 28s - loss: 1.0304 - regression_loss: 0.9150 - classification_loss: 0.1154 389/500 [======================>.......] - ETA: 27s - loss: 1.0295 - regression_loss: 0.9142 - classification_loss: 0.1152 390/500 [======================>.......] - ETA: 27s - loss: 1.0307 - regression_loss: 0.9152 - classification_loss: 0.1155 391/500 [======================>.......] - ETA: 27s - loss: 1.0309 - regression_loss: 0.9154 - classification_loss: 0.1155 392/500 [======================>.......] - ETA: 27s - loss: 1.0315 - regression_loss: 0.9159 - classification_loss: 0.1155 393/500 [======================>.......] - ETA: 26s - loss: 1.0327 - regression_loss: 0.9171 - classification_loss: 0.1156 394/500 [======================>.......] - ETA: 26s - loss: 1.0317 - regression_loss: 0.9160 - classification_loss: 0.1157 395/500 [======================>.......] - ETA: 26s - loss: 1.0321 - regression_loss: 0.9162 - classification_loss: 0.1159 396/500 [======================>.......] - ETA: 26s - loss: 1.0329 - regression_loss: 0.9168 - classification_loss: 0.1160 397/500 [======================>.......] - ETA: 25s - loss: 1.0322 - regression_loss: 0.9162 - classification_loss: 0.1160 398/500 [======================>.......] - ETA: 25s - loss: 1.0325 - regression_loss: 0.9166 - classification_loss: 0.1160 399/500 [======================>.......] - ETA: 25s - loss: 1.0330 - regression_loss: 0.9170 - classification_loss: 0.1161 400/500 [=======================>......] - ETA: 25s - loss: 1.0330 - regression_loss: 0.9170 - classification_loss: 0.1160 401/500 [=======================>......] - ETA: 24s - loss: 1.0339 - regression_loss: 0.9176 - classification_loss: 0.1163 402/500 [=======================>......] - ETA: 24s - loss: 1.0341 - regression_loss: 0.9178 - classification_loss: 0.1163 403/500 [=======================>......] - ETA: 24s - loss: 1.0340 - regression_loss: 0.9179 - classification_loss: 0.1161 404/500 [=======================>......] 
- ETA: 24s - loss: 1.0341 - regression_loss: 0.9181 - classification_loss: 0.1161 405/500 [=======================>......] - ETA: 23s - loss: 1.0346 - regression_loss: 0.9184 - classification_loss: 0.1162 406/500 [=======================>......] - ETA: 23s - loss: 1.0341 - regression_loss: 0.9179 - classification_loss: 0.1162 407/500 [=======================>......] - ETA: 23s - loss: 1.0346 - regression_loss: 0.9183 - classification_loss: 0.1163 408/500 [=======================>......] - ETA: 23s - loss: 1.0335 - regression_loss: 0.9174 - classification_loss: 0.1161 409/500 [=======================>......] - ETA: 22s - loss: 1.0330 - regression_loss: 0.9169 - classification_loss: 0.1161 410/500 [=======================>......] - ETA: 22s - loss: 1.0328 - regression_loss: 0.9170 - classification_loss: 0.1159 411/500 [=======================>......] - ETA: 22s - loss: 1.0332 - regression_loss: 0.9172 - classification_loss: 0.1160 412/500 [=======================>......] - ETA: 22s - loss: 1.0338 - regression_loss: 0.9178 - classification_loss: 0.1161 413/500 [=======================>......] - ETA: 21s - loss: 1.0333 - regression_loss: 0.9173 - classification_loss: 0.1160 414/500 [=======================>......] - ETA: 21s - loss: 1.0319 - regression_loss: 0.9162 - classification_loss: 0.1158 415/500 [=======================>......] - ETA: 21s - loss: 1.0320 - regression_loss: 0.9162 - classification_loss: 0.1158 416/500 [=======================>......] - ETA: 21s - loss: 1.0334 - regression_loss: 0.9175 - classification_loss: 0.1159 417/500 [========================>.....] - ETA: 20s - loss: 1.0335 - regression_loss: 0.9177 - classification_loss: 0.1158 418/500 [========================>.....] - ETA: 20s - loss: 1.0319 - regression_loss: 0.9162 - classification_loss: 0.1156 419/500 [========================>.....] - ETA: 20s - loss: 1.0314 - regression_loss: 0.9158 - classification_loss: 0.1156 420/500 [========================>.....] 
- ETA: 20s - loss: 1.0301 - regression_loss: 0.9147 - classification_loss: 0.1154 421/500 [========================>.....] - ETA: 19s - loss: 1.0300 - regression_loss: 0.9147 - classification_loss: 0.1154 422/500 [========================>.....] - ETA: 19s - loss: 1.0312 - regression_loss: 0.9156 - classification_loss: 0.1156 423/500 [========================>.....] - ETA: 19s - loss: 1.0314 - regression_loss: 0.9160 - classification_loss: 0.1154 424/500 [========================>.....] - ETA: 19s - loss: 1.0303 - regression_loss: 0.9151 - classification_loss: 0.1153 425/500 [========================>.....] - ETA: 18s - loss: 1.0293 - regression_loss: 0.9143 - classification_loss: 0.1151 426/500 [========================>.....] - ETA: 18s - loss: 1.0303 - regression_loss: 0.9151 - classification_loss: 0.1153 427/500 [========================>.....] - ETA: 18s - loss: 1.0298 - regression_loss: 0.9146 - classification_loss: 0.1152 428/500 [========================>.....] - ETA: 18s - loss: 1.0306 - regression_loss: 0.9154 - classification_loss: 0.1153 429/500 [========================>.....] - ETA: 17s - loss: 1.0310 - regression_loss: 0.9157 - classification_loss: 0.1153 430/500 [========================>.....] - ETA: 17s - loss: 1.0301 - regression_loss: 0.9149 - classification_loss: 0.1152 431/500 [========================>.....] - ETA: 17s - loss: 1.0314 - regression_loss: 0.9160 - classification_loss: 0.1155 432/500 [========================>.....] - ETA: 17s - loss: 1.0320 - regression_loss: 0.9165 - classification_loss: 0.1155 433/500 [========================>.....] - ETA: 16s - loss: 1.0319 - regression_loss: 0.9164 - classification_loss: 0.1155 434/500 [=========================>....] - ETA: 16s - loss: 1.0323 - regression_loss: 0.9166 - classification_loss: 0.1157 435/500 [=========================>....] - ETA: 16s - loss: 1.0330 - regression_loss: 0.9172 - classification_loss: 0.1158 436/500 [=========================>....] 
- ETA: 16s - loss: 1.0339 - regression_loss: 0.9180 - classification_loss: 0.1159 437/500 [=========================>....] - ETA: 15s - loss: 1.0322 - regression_loss: 0.9164 - classification_loss: 0.1158 438/500 [=========================>....] - ETA: 15s - loss: 1.0309 - regression_loss: 0.9153 - classification_loss: 0.1156 439/500 [=========================>....] - ETA: 15s - loss: 1.0315 - regression_loss: 0.9159 - classification_loss: 0.1156 440/500 [=========================>....] - ETA: 15s - loss: 1.0300 - regression_loss: 0.9146 - classification_loss: 0.1154 441/500 [=========================>....] - ETA: 14s - loss: 1.0302 - regression_loss: 0.9149 - classification_loss: 0.1153 442/500 [=========================>....] - ETA: 14s - loss: 1.0304 - regression_loss: 0.9151 - classification_loss: 0.1153 443/500 [=========================>....] - ETA: 14s - loss: 1.0315 - regression_loss: 0.9160 - classification_loss: 0.1155 444/500 [=========================>....] - ETA: 14s - loss: 1.0312 - regression_loss: 0.9157 - classification_loss: 0.1155 445/500 [=========================>....] - ETA: 13s - loss: 1.0316 - regression_loss: 0.9161 - classification_loss: 0.1155 446/500 [=========================>....] - ETA: 13s - loss: 1.0302 - regression_loss: 0.9149 - classification_loss: 0.1153 447/500 [=========================>....] - ETA: 13s - loss: 1.0288 - regression_loss: 0.9136 - classification_loss: 0.1152 448/500 [=========================>....] - ETA: 13s - loss: 1.0277 - regression_loss: 0.9126 - classification_loss: 0.1150 449/500 [=========================>....] - ETA: 12s - loss: 1.0266 - regression_loss: 0.9117 - classification_loss: 0.1149 450/500 [==========================>...] - ETA: 12s - loss: 1.0273 - regression_loss: 0.9123 - classification_loss: 0.1151 451/500 [==========================>...] - ETA: 12s - loss: 1.0269 - regression_loss: 0.9119 - classification_loss: 0.1150 452/500 [==========================>...] 
- ETA: 12s - loss: 1.0274 - regression_loss: 0.9123 - classification_loss: 0.1151 453/500 [==========================>...] - ETA: 11s - loss: 1.0276 - regression_loss: 0.9124 - classification_loss: 0.1151 454/500 [==========================>...] - ETA: 11s - loss: 1.0278 - regression_loss: 0.9126 - classification_loss: 0.1152 455/500 [==========================>...] - ETA: 11s - loss: 1.0269 - regression_loss: 0.9117 - classification_loss: 0.1152 456/500 [==========================>...] - ETA: 11s - loss: 1.0266 - regression_loss: 0.9113 - classification_loss: 0.1153 457/500 [==========================>...] - ETA: 10s - loss: 1.0275 - regression_loss: 0.9118 - classification_loss: 0.1156 458/500 [==========================>...] - ETA: 10s - loss: 1.0278 - regression_loss: 0.9122 - classification_loss: 0.1156 459/500 [==========================>...] - ETA: 10s - loss: 1.0282 - regression_loss: 0.9125 - classification_loss: 0.1157 460/500 [==========================>...] - ETA: 10s - loss: 1.0284 - regression_loss: 0.9126 - classification_loss: 0.1158 461/500 [==========================>...] - ETA: 9s - loss: 1.0272 - regression_loss: 0.9116 - classification_loss: 0.1156  462/500 [==========================>...] - ETA: 9s - loss: 1.0279 - regression_loss: 0.9119 - classification_loss: 0.1160 463/500 [==========================>...] - ETA: 9s - loss: 1.0262 - regression_loss: 0.9104 - classification_loss: 0.1158 464/500 [==========================>...] - ETA: 9s - loss: 1.0270 - regression_loss: 0.9112 - classification_loss: 0.1158 465/500 [==========================>...] - ETA: 8s - loss: 1.0269 - regression_loss: 0.9111 - classification_loss: 0.1158 466/500 [==========================>...] - ETA: 8s - loss: 1.0268 - regression_loss: 0.9109 - classification_loss: 0.1158 467/500 [===========================>..] - ETA: 8s - loss: 1.0270 - regression_loss: 0.9112 - classification_loss: 0.1158 468/500 [===========================>..] 
- ETA: 8s - loss: 1.0260 - regression_loss: 0.9103 - classification_loss: 0.1157 469/500 [===========================>..] - ETA: 7s - loss: 1.0270 - regression_loss: 0.9111 - classification_loss: 0.1159 470/500 [===========================>..] - ETA: 7s - loss: 1.0267 - regression_loss: 0.9107 - classification_loss: 0.1160 471/500 [===========================>..] - ETA: 7s - loss: 1.0272 - regression_loss: 0.9111 - classification_loss: 0.1160 472/500 [===========================>..] - ETA: 7s - loss: 1.0271 - regression_loss: 0.9109 - classification_loss: 0.1161 473/500 [===========================>..] - ETA: 6s - loss: 1.0264 - regression_loss: 0.9103 - classification_loss: 0.1161 474/500 [===========================>..] - ETA: 6s - loss: 1.0253 - regression_loss: 0.9093 - classification_loss: 0.1159 475/500 [===========================>..] - ETA: 6s - loss: 1.0252 - regression_loss: 0.9092 - classification_loss: 0.1159 476/500 [===========================>..] - ETA: 6s - loss: 1.0247 - regression_loss: 0.9088 - classification_loss: 0.1158 477/500 [===========================>..] - ETA: 5s - loss: 1.0245 - regression_loss: 0.9087 - classification_loss: 0.1158 478/500 [===========================>..] - ETA: 5s - loss: 1.0247 - regression_loss: 0.9089 - classification_loss: 0.1158 479/500 [===========================>..] - ETA: 5s - loss: 1.0233 - regression_loss: 0.9076 - classification_loss: 0.1156 480/500 [===========================>..] - ETA: 5s - loss: 1.0242 - regression_loss: 0.9085 - classification_loss: 0.1157 481/500 [===========================>..] - ETA: 4s - loss: 1.0237 - regression_loss: 0.9080 - classification_loss: 0.1157 482/500 [===========================>..] - ETA: 4s - loss: 1.0243 - regression_loss: 0.9084 - classification_loss: 0.1159 483/500 [===========================>..] - ETA: 4s - loss: 1.0235 - regression_loss: 0.9077 - classification_loss: 0.1158 484/500 [============================>.] 
- ETA: 4s - loss: 1.0238 - regression_loss: 0.9078 - classification_loss: 0.1160 485/500 [============================>.] - ETA: 3s - loss: 1.0249 - regression_loss: 0.9087 - classification_loss: 0.1162 486/500 [============================>.] - ETA: 3s - loss: 1.0244 - regression_loss: 0.9082 - classification_loss: 0.1163 487/500 [============================>.] - ETA: 3s - loss: 1.0251 - regression_loss: 0.9088 - classification_loss: 0.1163 488/500 [============================>.] - ETA: 3s - loss: 1.0251 - regression_loss: 0.9087 - classification_loss: 0.1163 489/500 [============================>.] - ETA: 2s - loss: 1.0258 - regression_loss: 0.9093 - classification_loss: 0.1164 490/500 [============================>.] - ETA: 2s - loss: 1.0262 - regression_loss: 0.9098 - classification_loss: 0.1164 491/500 [============================>.] - ETA: 2s - loss: 1.0265 - regression_loss: 0.9101 - classification_loss: 0.1164 492/500 [============================>.] - ETA: 2s - loss: 1.0268 - regression_loss: 0.9104 - classification_loss: 0.1164 493/500 [============================>.] - ETA: 1s - loss: 1.0266 - regression_loss: 0.9102 - classification_loss: 0.1163 494/500 [============================>.] - ETA: 1s - loss: 1.0264 - regression_loss: 0.9100 - classification_loss: 0.1163 495/500 [============================>.] - ETA: 1s - loss: 1.0259 - regression_loss: 0.9097 - classification_loss: 0.1162 496/500 [============================>.] - ETA: 1s - loss: 1.0281 - regression_loss: 0.9115 - classification_loss: 0.1166 497/500 [============================>.] - ETA: 0s - loss: 1.0278 - regression_loss: 0.9112 - classification_loss: 0.1166 498/500 [============================>.] - ETA: 0s - loss: 1.0270 - regression_loss: 0.9106 - classification_loss: 0.1165 499/500 [============================>.] 
- ETA: 0s - loss: 1.0272 - regression_loss: 0.9106 - classification_loss: 0.1165 500/500 [==============================] - 125s 250ms/step - loss: 1.0272 - regression_loss: 0.9106 - classification_loss: 0.1166 1172 instances of class plum with average precision: 0.7237 mAP: 0.7237 Epoch 00034: saving model to ./training/snapshots/resnet50_pascal_34.h5 Epoch 35/150 1/500 [..............................] - ETA: 1:49 - loss: 0.4175 - regression_loss: 0.3929 - classification_loss: 0.0246 2/500 [..............................] - ETA: 1:49 - loss: 0.6504 - regression_loss: 0.5934 - classification_loss: 0.0570 3/500 [..............................] - ETA: 1:54 - loss: 1.0183 - regression_loss: 0.8820 - classification_loss: 0.1363 4/500 [..............................] - ETA: 1:56 - loss: 1.0464 - regression_loss: 0.9011 - classification_loss: 0.1454 5/500 [..............................] - ETA: 1:57 - loss: 1.0969 - regression_loss: 0.9449 - classification_loss: 0.1520 6/500 [..............................] - ETA: 1:58 - loss: 1.0803 - regression_loss: 0.9367 - classification_loss: 0.1436 7/500 [..............................] - ETA: 1:59 - loss: 1.1167 - regression_loss: 0.9663 - classification_loss: 0.1504 8/500 [..............................] - ETA: 1:59 - loss: 1.1048 - regression_loss: 0.9575 - classification_loss: 0.1473 9/500 [..............................] - ETA: 1:58 - loss: 1.1117 - regression_loss: 0.9545 - classification_loss: 0.1573 10/500 [..............................] - ETA: 1:58 - loss: 1.0089 - regression_loss: 0.8590 - classification_loss: 0.1498 11/500 [..............................] - ETA: 1:58 - loss: 1.0212 - regression_loss: 0.8715 - classification_loss: 0.1497 12/500 [..............................] - ETA: 1:58 - loss: 1.0155 - regression_loss: 0.8732 - classification_loss: 0.1423 13/500 [..............................] - ETA: 1:58 - loss: 0.9916 - regression_loss: 0.8537 - classification_loss: 0.1378 14/500 [..............................] 
- ETA: 1:58 - loss: 0.9452 - regression_loss: 0.8155 - classification_loss: 0.1298 15/500 [..............................] - ETA: 1:58 - loss: 0.9616 - regression_loss: 0.8319 - classification_loss: 0.1297 16/500 [..............................] - ETA: 1:58 - loss: 0.9359 - regression_loss: 0.8126 - classification_loss: 0.1233 17/500 [>.............................] - ETA: 1:58 - loss: 0.9825 - regression_loss: 0.8502 - classification_loss: 0.1322 18/500 [>.............................] - ETA: 1:58 - loss: 1.0067 - regression_loss: 0.8723 - classification_loss: 0.1344 19/500 [>.............................] - ETA: 1:58 - loss: 0.9672 - regression_loss: 0.8391 - classification_loss: 0.1280 20/500 [>.............................] - ETA: 1:58 - loss: 1.0052 - regression_loss: 0.8694 - classification_loss: 0.1359 21/500 [>.............................] - ETA: 1:58 - loss: 1.0188 - regression_loss: 0.8829 - classification_loss: 0.1359 22/500 [>.............................] - ETA: 1:58 - loss: 1.0393 - regression_loss: 0.9004 - classification_loss: 0.1389 23/500 [>.............................] - ETA: 1:58 - loss: 1.0466 - regression_loss: 0.9087 - classification_loss: 0.1379 24/500 [>.............................] - ETA: 1:58 - loss: 1.0407 - regression_loss: 0.9041 - classification_loss: 0.1366 25/500 [>.............................] - ETA: 1:58 - loss: 1.0407 - regression_loss: 0.9056 - classification_loss: 0.1351 26/500 [>.............................] - ETA: 1:58 - loss: 1.0640 - regression_loss: 0.9253 - classification_loss: 0.1387 27/500 [>.............................] - ETA: 1:58 - loss: 1.0717 - regression_loss: 0.9330 - classification_loss: 0.1386 28/500 [>.............................] - ETA: 1:57 - loss: 1.0894 - regression_loss: 0.9495 - classification_loss: 0.1399 29/500 [>.............................] - ETA: 1:57 - loss: 1.0770 - regression_loss: 0.9404 - classification_loss: 0.1366 30/500 [>.............................] 
[progress-bar updates for steps 31/500 through 366/500 collapsed: loss 1.0868 (regression_loss 0.9503, classification_loss 0.1365) at step 31, fluctuating to loss 1.0217 (regression_loss 0.9067, classification_loss 0.1150) at step 366; ETA 1:57 down to 33s]
- ETA: 33s - loss: 1.0217 - regression_loss: 0.9066 - classification_loss: 0.1151 367/500 [=====================>........] - ETA: 33s - loss: 1.0228 - regression_loss: 0.9076 - classification_loss: 0.1153 368/500 [=====================>........] - ETA: 33s - loss: 1.0221 - regression_loss: 0.9070 - classification_loss: 0.1151 369/500 [=====================>........] - ETA: 32s - loss: 1.0211 - regression_loss: 0.9063 - classification_loss: 0.1149 370/500 [=====================>........] - ETA: 32s - loss: 1.0196 - regression_loss: 0.9050 - classification_loss: 0.1147 371/500 [=====================>........] - ETA: 32s - loss: 1.0200 - regression_loss: 0.9053 - classification_loss: 0.1147 372/500 [=====================>........] - ETA: 32s - loss: 1.0203 - regression_loss: 0.9055 - classification_loss: 0.1148 373/500 [=====================>........] - ETA: 31s - loss: 1.0206 - regression_loss: 0.9057 - classification_loss: 0.1149 374/500 [=====================>........] - ETA: 31s - loss: 1.0194 - regression_loss: 0.9047 - classification_loss: 0.1147 375/500 [=====================>........] - ETA: 31s - loss: 1.0203 - regression_loss: 0.9054 - classification_loss: 0.1149 376/500 [=====================>........] - ETA: 31s - loss: 1.0207 - regression_loss: 0.9057 - classification_loss: 0.1150 377/500 [=====================>........] - ETA: 30s - loss: 1.0212 - regression_loss: 0.9061 - classification_loss: 0.1151 378/500 [=====================>........] - ETA: 30s - loss: 1.0205 - regression_loss: 0.9055 - classification_loss: 0.1151 379/500 [=====================>........] - ETA: 30s - loss: 1.0219 - regression_loss: 0.9067 - classification_loss: 0.1152 380/500 [=====================>........] - ETA: 30s - loss: 1.0200 - regression_loss: 0.9050 - classification_loss: 0.1151 381/500 [=====================>........] - ETA: 29s - loss: 1.0206 - regression_loss: 0.9055 - classification_loss: 0.1151 382/500 [=====================>........] 
- ETA: 29s - loss: 1.0219 - regression_loss: 0.9065 - classification_loss: 0.1154 383/500 [=====================>........] - ETA: 29s - loss: 1.0215 - regression_loss: 0.9062 - classification_loss: 0.1153 384/500 [======================>.......] - ETA: 29s - loss: 1.0209 - regression_loss: 0.9057 - classification_loss: 0.1152 385/500 [======================>.......] - ETA: 28s - loss: 1.0213 - regression_loss: 0.9060 - classification_loss: 0.1153 386/500 [======================>.......] - ETA: 28s - loss: 1.0208 - regression_loss: 0.9056 - classification_loss: 0.1152 387/500 [======================>.......] - ETA: 28s - loss: 1.0219 - regression_loss: 0.9066 - classification_loss: 0.1153 388/500 [======================>.......] - ETA: 28s - loss: 1.0253 - regression_loss: 0.9090 - classification_loss: 0.1162 389/500 [======================>.......] - ETA: 27s - loss: 1.0255 - regression_loss: 0.9092 - classification_loss: 0.1163 390/500 [======================>.......] - ETA: 27s - loss: 1.0256 - regression_loss: 0.9092 - classification_loss: 0.1163 391/500 [======================>.......] - ETA: 27s - loss: 1.0251 - regression_loss: 0.9088 - classification_loss: 0.1163 392/500 [======================>.......] - ETA: 27s - loss: 1.0258 - regression_loss: 0.9094 - classification_loss: 0.1164 393/500 [======================>.......] - ETA: 26s - loss: 1.0266 - regression_loss: 0.9100 - classification_loss: 0.1166 394/500 [======================>.......] - ETA: 26s - loss: 1.0274 - regression_loss: 0.9108 - classification_loss: 0.1167 395/500 [======================>.......] - ETA: 26s - loss: 1.0276 - regression_loss: 0.9109 - classification_loss: 0.1167 396/500 [======================>.......] - ETA: 26s - loss: 1.0287 - regression_loss: 0.9118 - classification_loss: 0.1169 397/500 [======================>.......] - ETA: 25s - loss: 1.0288 - regression_loss: 0.9119 - classification_loss: 0.1169 398/500 [======================>.......] 
- ETA: 25s - loss: 1.0287 - regression_loss: 0.9119 - classification_loss: 0.1169 399/500 [======================>.......] - ETA: 25s - loss: 1.0292 - regression_loss: 0.9122 - classification_loss: 0.1169 400/500 [=======================>......] - ETA: 25s - loss: 1.0290 - regression_loss: 0.9120 - classification_loss: 0.1170 401/500 [=======================>......] - ETA: 24s - loss: 1.0273 - regression_loss: 0.9105 - classification_loss: 0.1168 402/500 [=======================>......] - ETA: 24s - loss: 1.0278 - regression_loss: 0.9110 - classification_loss: 0.1168 403/500 [=======================>......] - ETA: 24s - loss: 1.0284 - regression_loss: 0.9115 - classification_loss: 0.1168 404/500 [=======================>......] - ETA: 24s - loss: 1.0290 - regression_loss: 0.9120 - classification_loss: 0.1169 405/500 [=======================>......] - ETA: 23s - loss: 1.0295 - regression_loss: 0.9124 - classification_loss: 0.1171 406/500 [=======================>......] - ETA: 23s - loss: 1.0306 - regression_loss: 0.9132 - classification_loss: 0.1174 407/500 [=======================>......] - ETA: 23s - loss: 1.0292 - regression_loss: 0.9120 - classification_loss: 0.1172 408/500 [=======================>......] - ETA: 23s - loss: 1.0276 - regression_loss: 0.9106 - classification_loss: 0.1170 409/500 [=======================>......] - ETA: 22s - loss: 1.0272 - regression_loss: 0.9103 - classification_loss: 0.1169 410/500 [=======================>......] - ETA: 22s - loss: 1.0259 - regression_loss: 0.9091 - classification_loss: 0.1168 411/500 [=======================>......] - ETA: 22s - loss: 1.0275 - regression_loss: 0.9100 - classification_loss: 0.1175 412/500 [=======================>......] - ETA: 22s - loss: 1.0289 - regression_loss: 0.9112 - classification_loss: 0.1177 413/500 [=======================>......] - ETA: 21s - loss: 1.0293 - regression_loss: 0.9115 - classification_loss: 0.1178 414/500 [=======================>......] 
- ETA: 21s - loss: 1.0310 - regression_loss: 0.9130 - classification_loss: 0.1180 415/500 [=======================>......] - ETA: 21s - loss: 1.0323 - regression_loss: 0.9141 - classification_loss: 0.1181 416/500 [=======================>......] - ETA: 21s - loss: 1.0319 - regression_loss: 0.9138 - classification_loss: 0.1181 417/500 [========================>.....] - ETA: 20s - loss: 1.0319 - regression_loss: 0.9140 - classification_loss: 0.1179 418/500 [========================>.....] - ETA: 20s - loss: 1.0330 - regression_loss: 0.9151 - classification_loss: 0.1179 419/500 [========================>.....] - ETA: 20s - loss: 1.0320 - regression_loss: 0.9143 - classification_loss: 0.1177 420/500 [========================>.....] - ETA: 20s - loss: 1.0331 - regression_loss: 0.9152 - classification_loss: 0.1179 421/500 [========================>.....] - ETA: 19s - loss: 1.0330 - regression_loss: 0.9151 - classification_loss: 0.1180 422/500 [========================>.....] - ETA: 19s - loss: 1.0333 - regression_loss: 0.9153 - classification_loss: 0.1180 423/500 [========================>.....] - ETA: 19s - loss: 1.0338 - regression_loss: 0.9157 - classification_loss: 0.1181 424/500 [========================>.....] - ETA: 19s - loss: 1.0354 - regression_loss: 0.9170 - classification_loss: 0.1184 425/500 [========================>.....] - ETA: 18s - loss: 1.0351 - regression_loss: 0.9168 - classification_loss: 0.1183 426/500 [========================>.....] - ETA: 18s - loss: 1.0353 - regression_loss: 0.9170 - classification_loss: 0.1183 427/500 [========================>.....] - ETA: 18s - loss: 1.0348 - regression_loss: 0.9166 - classification_loss: 0.1182 428/500 [========================>.....] - ETA: 18s - loss: 1.0346 - regression_loss: 0.9164 - classification_loss: 0.1182 429/500 [========================>.....] - ETA: 17s - loss: 1.0341 - regression_loss: 0.9161 - classification_loss: 0.1181 430/500 [========================>.....] 
- ETA: 17s - loss: 1.0354 - regression_loss: 0.9171 - classification_loss: 0.1183 431/500 [========================>.....] - ETA: 17s - loss: 1.0351 - regression_loss: 0.9167 - classification_loss: 0.1184 432/500 [========================>.....] - ETA: 17s - loss: 1.0354 - regression_loss: 0.9171 - classification_loss: 0.1184 433/500 [========================>.....] - ETA: 16s - loss: 1.0360 - regression_loss: 0.9176 - classification_loss: 0.1184 434/500 [=========================>....] - ETA: 16s - loss: 1.0371 - regression_loss: 0.9186 - classification_loss: 0.1186 435/500 [=========================>....] - ETA: 16s - loss: 1.0365 - regression_loss: 0.9180 - classification_loss: 0.1185 436/500 [=========================>....] - ETA: 16s - loss: 1.0370 - regression_loss: 0.9185 - classification_loss: 0.1185 437/500 [=========================>....] - ETA: 15s - loss: 1.0357 - regression_loss: 0.9173 - classification_loss: 0.1184 438/500 [=========================>....] - ETA: 15s - loss: 1.0357 - regression_loss: 0.9173 - classification_loss: 0.1184 439/500 [=========================>....] - ETA: 15s - loss: 1.0349 - regression_loss: 0.9167 - classification_loss: 0.1182 440/500 [=========================>....] - ETA: 15s - loss: 1.0338 - regression_loss: 0.9157 - classification_loss: 0.1181 441/500 [=========================>....] - ETA: 14s - loss: 1.0349 - regression_loss: 0.9165 - classification_loss: 0.1184 442/500 [=========================>....] - ETA: 14s - loss: 1.0358 - regression_loss: 0.9173 - classification_loss: 0.1185 443/500 [=========================>....] - ETA: 14s - loss: 1.0369 - regression_loss: 0.9183 - classification_loss: 0.1186 444/500 [=========================>....] - ETA: 14s - loss: 1.0373 - regression_loss: 0.9187 - classification_loss: 0.1186 445/500 [=========================>....] - ETA: 13s - loss: 1.0362 - regression_loss: 0.9178 - classification_loss: 0.1184 446/500 [=========================>....] 
- ETA: 13s - loss: 1.0364 - regression_loss: 0.9180 - classification_loss: 0.1184 447/500 [=========================>....] - ETA: 13s - loss: 1.0370 - regression_loss: 0.9185 - classification_loss: 0.1185 448/500 [=========================>....] - ETA: 13s - loss: 1.0374 - regression_loss: 0.9187 - classification_loss: 0.1187 449/500 [=========================>....] - ETA: 12s - loss: 1.0385 - regression_loss: 0.9196 - classification_loss: 0.1188 450/500 [==========================>...] - ETA: 12s - loss: 1.0370 - regression_loss: 0.9184 - classification_loss: 0.1186 451/500 [==========================>...] - ETA: 12s - loss: 1.0376 - regression_loss: 0.9189 - classification_loss: 0.1187 452/500 [==========================>...] - ETA: 12s - loss: 1.0377 - regression_loss: 0.9190 - classification_loss: 0.1187 453/500 [==========================>...] - ETA: 11s - loss: 1.0383 - regression_loss: 0.9196 - classification_loss: 0.1187 454/500 [==========================>...] - ETA: 11s - loss: 1.0379 - regression_loss: 0.9194 - classification_loss: 0.1186 455/500 [==========================>...] - ETA: 11s - loss: 1.0369 - regression_loss: 0.9184 - classification_loss: 0.1185 456/500 [==========================>...] - ETA: 11s - loss: 1.0361 - regression_loss: 0.9178 - classification_loss: 0.1183 457/500 [==========================>...] - ETA: 10s - loss: 1.0359 - regression_loss: 0.9178 - classification_loss: 0.1181 458/500 [==========================>...] - ETA: 10s - loss: 1.0368 - regression_loss: 0.9187 - classification_loss: 0.1181 459/500 [==========================>...] - ETA: 10s - loss: 1.0358 - regression_loss: 0.9179 - classification_loss: 0.1179 460/500 [==========================>...] - ETA: 10s - loss: 1.0354 - regression_loss: 0.9176 - classification_loss: 0.1178 461/500 [==========================>...] - ETA: 9s - loss: 1.0361 - regression_loss: 0.9182 - classification_loss: 0.1179  462/500 [==========================>...] 
- ETA: 9s - loss: 1.0353 - regression_loss: 0.9175 - classification_loss: 0.1178 463/500 [==========================>...] - ETA: 9s - loss: 1.0359 - regression_loss: 0.9180 - classification_loss: 0.1179 464/500 [==========================>...] - ETA: 9s - loss: 1.0363 - regression_loss: 0.9183 - classification_loss: 0.1180 465/500 [==========================>...] - ETA: 8s - loss: 1.0357 - regression_loss: 0.9179 - classification_loss: 0.1178 466/500 [==========================>...] - ETA: 8s - loss: 1.0355 - regression_loss: 0.9178 - classification_loss: 0.1177 467/500 [===========================>..] - ETA: 8s - loss: 1.0357 - regression_loss: 0.9179 - classification_loss: 0.1177 468/500 [===========================>..] - ETA: 8s - loss: 1.0366 - regression_loss: 0.9187 - classification_loss: 0.1179 469/500 [===========================>..] - ETA: 7s - loss: 1.0366 - regression_loss: 0.9188 - classification_loss: 0.1178 470/500 [===========================>..] - ETA: 7s - loss: 1.0363 - regression_loss: 0.9185 - classification_loss: 0.1178 471/500 [===========================>..] - ETA: 7s - loss: 1.0356 - regression_loss: 0.9179 - classification_loss: 0.1177 472/500 [===========================>..] - ETA: 7s - loss: 1.0354 - regression_loss: 0.9177 - classification_loss: 0.1177 473/500 [===========================>..] - ETA: 6s - loss: 1.0353 - regression_loss: 0.9176 - classification_loss: 0.1177 474/500 [===========================>..] - ETA: 6s - loss: 1.0342 - regression_loss: 0.9165 - classification_loss: 0.1176 475/500 [===========================>..] - ETA: 6s - loss: 1.0340 - regression_loss: 0.9164 - classification_loss: 0.1176 476/500 [===========================>..] - ETA: 6s - loss: 1.0329 - regression_loss: 0.9155 - classification_loss: 0.1174 477/500 [===========================>..] - ETA: 5s - loss: 1.0326 - regression_loss: 0.9153 - classification_loss: 0.1173 478/500 [===========================>..] 
- ETA: 5s - loss: 1.0327 - regression_loss: 0.9153 - classification_loss: 0.1174 479/500 [===========================>..] - ETA: 5s - loss: 1.0330 - regression_loss: 0.9156 - classification_loss: 0.1174 480/500 [===========================>..] - ETA: 5s - loss: 1.0335 - regression_loss: 0.9160 - classification_loss: 0.1175 481/500 [===========================>..] - ETA: 4s - loss: 1.0342 - regression_loss: 0.9165 - classification_loss: 0.1177 482/500 [===========================>..] - ETA: 4s - loss: 1.0336 - regression_loss: 0.9160 - classification_loss: 0.1176 483/500 [===========================>..] - ETA: 4s - loss: 1.0336 - regression_loss: 0.9160 - classification_loss: 0.1176 484/500 [============================>.] - ETA: 4s - loss: 1.0326 - regression_loss: 0.9151 - classification_loss: 0.1175 485/500 [============================>.] - ETA: 3s - loss: 1.0333 - regression_loss: 0.9155 - classification_loss: 0.1178 486/500 [============================>.] - ETA: 3s - loss: 1.0336 - regression_loss: 0.9158 - classification_loss: 0.1178 487/500 [============================>.] - ETA: 3s - loss: 1.0329 - regression_loss: 0.9153 - classification_loss: 0.1177 488/500 [============================>.] - ETA: 3s - loss: 1.0333 - regression_loss: 0.9156 - classification_loss: 0.1177 489/500 [============================>.] - ETA: 2s - loss: 1.0322 - regression_loss: 0.9147 - classification_loss: 0.1175 490/500 [============================>.] - ETA: 2s - loss: 1.0321 - regression_loss: 0.9146 - classification_loss: 0.1174 491/500 [============================>.] - ETA: 2s - loss: 1.0318 - regression_loss: 0.9144 - classification_loss: 0.1174 492/500 [============================>.] - ETA: 2s - loss: 1.0318 - regression_loss: 0.9144 - classification_loss: 0.1173 493/500 [============================>.] - ETA: 1s - loss: 1.0325 - regression_loss: 0.9150 - classification_loss: 0.1174 494/500 [============================>.] 
- ETA: 1s - loss: 1.0312 - regression_loss: 0.9139 - classification_loss: 0.1173 495/500 [============================>.] - ETA: 1s - loss: 1.0299 - regression_loss: 0.9128 - classification_loss: 0.1171 496/500 [============================>.] - ETA: 1s - loss: 1.0299 - regression_loss: 0.9128 - classification_loss: 0.1171 497/500 [============================>.] - ETA: 0s - loss: 1.0298 - regression_loss: 0.9128 - classification_loss: 0.1170 498/500 [============================>.] - ETA: 0s - loss: 1.0300 - regression_loss: 0.9130 - classification_loss: 0.1169 499/500 [============================>.] - ETA: 0s - loss: 1.0306 - regression_loss: 0.9135 - classification_loss: 0.1171 500/500 [==============================] - 125s 250ms/step - loss: 1.0305 - regression_loss: 0.9135 - classification_loss: 0.1171 1172 instances of class plum with average precision: 0.8029 mAP: 0.8029 Epoch 00035: saving model to ./training/snapshots/resnet50_pascal_35.h5 Epoch 36/150 1/500 [..............................] - ETA: 2:03 - loss: 0.7555 - regression_loss: 0.6964 - classification_loss: 0.0591 2/500 [..............................] - ETA: 1:58 - loss: 0.5479 - regression_loss: 0.5066 - classification_loss: 0.0413 3/500 [..............................] - ETA: 2:01 - loss: 0.5720 - regression_loss: 0.5270 - classification_loss: 0.0450 4/500 [..............................] - ETA: 2:01 - loss: 0.7358 - regression_loss: 0.6660 - classification_loss: 0.0698 5/500 [..............................] - ETA: 2:01 - loss: 0.7278 - regression_loss: 0.6553 - classification_loss: 0.0725 6/500 [..............................] - ETA: 1:59 - loss: 0.7543 - regression_loss: 0.6758 - classification_loss: 0.0785 7/500 [..............................] - ETA: 1:59 - loss: 0.6852 - regression_loss: 0.6151 - classification_loss: 0.0702 8/500 [..............................] - ETA: 1:59 - loss: 0.7085 - regression_loss: 0.6349 - classification_loss: 0.0736 9/500 [..............................] 
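A minimal sketch of how the logged quantities relate, assuming the usual keras-retinanet convention that the reported total `loss` is the sum of the smooth-L1 regression term and the focal classification term (the figures below are copied from the epoch 35 summary line; agreement is only up to display rounding):

```python
# Hedged sketch: check that the total loss printed by the progress bar
# matches the sum of its two logged components, within rounding slack.
regression_loss = 0.9135      # from the epoch 35 summary line
classification_loss = 0.1171  # from the epoch 35 summary line
total_loss = regression_loss + classification_loss

# The log prints 1.0305; each component is rounded to 4 decimals,
# so allow a small tolerance rather than exact equality.
assert abs(total_loss - 1.0305) < 1e-3
print(f"total loss ~ {total_loss:.4f}")
```

Because only one class (plum) is present, the reported mAP equals that class's average precision.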
[per-batch progress updates for epoch 36 omitted]
- ETA: 1:30 - loss: 1.0237 - regression_loss: 0.9042 - classification_loss: 0.1195 138/500 [=======>......................] - ETA: 1:29 - loss: 1.0213 - regression_loss: 0.9023 - classification_loss: 0.1190 139/500 [=======>......................] - ETA: 1:29 - loss: 1.0224 - regression_loss: 0.9031 - classification_loss: 0.1192 140/500 [=======>......................] - ETA: 1:29 - loss: 1.0194 - regression_loss: 0.9007 - classification_loss: 0.1187 141/500 [=======>......................] - ETA: 1:29 - loss: 1.0161 - regression_loss: 0.8978 - classification_loss: 0.1183 142/500 [=======>......................] - ETA: 1:29 - loss: 1.0167 - regression_loss: 0.8977 - classification_loss: 0.1190 143/500 [=======>......................] - ETA: 1:28 - loss: 1.0119 - regression_loss: 0.8937 - classification_loss: 0.1182 144/500 [=======>......................] - ETA: 1:28 - loss: 1.0115 - regression_loss: 0.8932 - classification_loss: 0.1184 145/500 [=======>......................] - ETA: 1:28 - loss: 1.0141 - regression_loss: 0.8953 - classification_loss: 0.1188 146/500 [=======>......................] - ETA: 1:28 - loss: 1.0154 - regression_loss: 0.8965 - classification_loss: 0.1189 147/500 [=======>......................] - ETA: 1:27 - loss: 1.0165 - regression_loss: 0.8977 - classification_loss: 0.1188 148/500 [=======>......................] - ETA: 1:27 - loss: 1.0161 - regression_loss: 0.8975 - classification_loss: 0.1186 149/500 [=======>......................] - ETA: 1:27 - loss: 1.0152 - regression_loss: 0.8965 - classification_loss: 0.1186 150/500 [========>.....................] - ETA: 1:27 - loss: 1.0147 - regression_loss: 0.8963 - classification_loss: 0.1185 151/500 [========>.....................] - ETA: 1:26 - loss: 1.0187 - regression_loss: 0.8997 - classification_loss: 0.1190 152/500 [========>.....................] - ETA: 1:26 - loss: 1.0203 - regression_loss: 0.9009 - classification_loss: 0.1194 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.0179 - regression_loss: 0.8989 - classification_loss: 0.1190 154/500 [========>.....................] - ETA: 1:26 - loss: 1.0173 - regression_loss: 0.8984 - classification_loss: 0.1189 155/500 [========>.....................] - ETA: 1:25 - loss: 1.0179 - regression_loss: 0.8990 - classification_loss: 0.1189 156/500 [========>.....................] - ETA: 1:25 - loss: 1.0178 - regression_loss: 0.8990 - classification_loss: 0.1188 157/500 [========>.....................] - ETA: 1:25 - loss: 1.0164 - regression_loss: 0.8979 - classification_loss: 0.1185 158/500 [========>.....................] - ETA: 1:25 - loss: 1.0215 - regression_loss: 0.9019 - classification_loss: 0.1196 159/500 [========>.....................] - ETA: 1:24 - loss: 1.0211 - regression_loss: 0.9018 - classification_loss: 0.1193 160/500 [========>.....................] - ETA: 1:24 - loss: 1.0216 - regression_loss: 0.9021 - classification_loss: 0.1195 161/500 [========>.....................] - ETA: 1:24 - loss: 1.0245 - regression_loss: 0.9045 - classification_loss: 0.1199 162/500 [========>.....................] - ETA: 1:24 - loss: 1.0247 - regression_loss: 0.9049 - classification_loss: 0.1198 163/500 [========>.....................] - ETA: 1:23 - loss: 1.0268 - regression_loss: 0.9074 - classification_loss: 0.1194 164/500 [========>.....................] - ETA: 1:23 - loss: 1.0236 - regression_loss: 0.9047 - classification_loss: 0.1189 165/500 [========>.....................] - ETA: 1:23 - loss: 1.0244 - regression_loss: 0.9057 - classification_loss: 0.1187 166/500 [========>.....................] - ETA: 1:23 - loss: 1.0226 - regression_loss: 0.9041 - classification_loss: 0.1185 167/500 [=========>....................] - ETA: 1:23 - loss: 1.0205 - regression_loss: 0.9025 - classification_loss: 0.1180 168/500 [=========>....................] - ETA: 1:22 - loss: 1.0203 - regression_loss: 0.9021 - classification_loss: 0.1182 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.0181 - regression_loss: 0.9003 - classification_loss: 0.1179 170/500 [=========>....................] - ETA: 1:22 - loss: 1.0206 - regression_loss: 0.9025 - classification_loss: 0.1181 171/500 [=========>....................] - ETA: 1:22 - loss: 1.0185 - regression_loss: 0.9006 - classification_loss: 0.1180 172/500 [=========>....................] - ETA: 1:21 - loss: 1.0174 - regression_loss: 0.8996 - classification_loss: 0.1178 173/500 [=========>....................] - ETA: 1:21 - loss: 1.0161 - regression_loss: 0.8986 - classification_loss: 0.1175 174/500 [=========>....................] - ETA: 1:21 - loss: 1.0146 - regression_loss: 0.8974 - classification_loss: 0.1172 175/500 [=========>....................] - ETA: 1:21 - loss: 1.0136 - regression_loss: 0.8963 - classification_loss: 0.1173 176/500 [=========>....................] - ETA: 1:20 - loss: 1.0140 - regression_loss: 0.8967 - classification_loss: 0.1173 177/500 [=========>....................] - ETA: 1:20 - loss: 1.0147 - regression_loss: 0.8973 - classification_loss: 0.1174 178/500 [=========>....................] - ETA: 1:20 - loss: 1.0137 - regression_loss: 0.8963 - classification_loss: 0.1174 179/500 [=========>....................] - ETA: 1:19 - loss: 1.0092 - regression_loss: 0.8923 - classification_loss: 0.1170 180/500 [=========>....................] - ETA: 1:19 - loss: 1.0092 - regression_loss: 0.8922 - classification_loss: 0.1169 181/500 [=========>....................] - ETA: 1:19 - loss: 1.0114 - regression_loss: 0.8944 - classification_loss: 0.1170 182/500 [=========>....................] - ETA: 1:19 - loss: 1.0143 - regression_loss: 0.8966 - classification_loss: 0.1177 183/500 [=========>....................] - ETA: 1:18 - loss: 1.0149 - regression_loss: 0.8970 - classification_loss: 0.1179 184/500 [==========>...................] - ETA: 1:18 - loss: 1.0147 - regression_loss: 0.8969 - classification_loss: 0.1179 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.0171 - regression_loss: 0.8989 - classification_loss: 0.1182 186/500 [==========>...................] - ETA: 1:18 - loss: 1.0159 - regression_loss: 0.8980 - classification_loss: 0.1179 187/500 [==========>...................] - ETA: 1:18 - loss: 1.0183 - regression_loss: 0.8985 - classification_loss: 0.1198 188/500 [==========>...................] - ETA: 1:17 - loss: 1.0160 - regression_loss: 0.8963 - classification_loss: 0.1197 189/500 [==========>...................] - ETA: 1:17 - loss: 1.0132 - regression_loss: 0.8939 - classification_loss: 0.1193 190/500 [==========>...................] - ETA: 1:17 - loss: 1.0123 - regression_loss: 0.8932 - classification_loss: 0.1192 191/500 [==========>...................] - ETA: 1:17 - loss: 1.0131 - regression_loss: 0.8939 - classification_loss: 0.1192 192/500 [==========>...................] - ETA: 1:16 - loss: 1.0127 - regression_loss: 0.8935 - classification_loss: 0.1193 193/500 [==========>...................] - ETA: 1:16 - loss: 1.0115 - regression_loss: 0.8925 - classification_loss: 0.1190 194/500 [==========>...................] - ETA: 1:16 - loss: 1.0096 - regression_loss: 0.8911 - classification_loss: 0.1185 195/500 [==========>...................] - ETA: 1:16 - loss: 1.0101 - regression_loss: 0.8912 - classification_loss: 0.1190 196/500 [==========>...................] - ETA: 1:15 - loss: 1.0074 - regression_loss: 0.8889 - classification_loss: 0.1185 197/500 [==========>...................] - ETA: 1:15 - loss: 1.0074 - regression_loss: 0.8892 - classification_loss: 0.1182 198/500 [==========>...................] - ETA: 1:15 - loss: 1.0080 - regression_loss: 0.8898 - classification_loss: 0.1182 199/500 [==========>...................] - ETA: 1:15 - loss: 1.0098 - regression_loss: 0.8913 - classification_loss: 0.1185 200/500 [===========>..................] - ETA: 1:14 - loss: 1.0123 - regression_loss: 0.8933 - classification_loss: 0.1190 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.0135 - regression_loss: 0.8942 - classification_loss: 0.1192 202/500 [===========>..................] - ETA: 1:14 - loss: 1.0142 - regression_loss: 0.8950 - classification_loss: 0.1192 203/500 [===========>..................] - ETA: 1:14 - loss: 1.0156 - regression_loss: 0.8961 - classification_loss: 0.1195 204/500 [===========>..................] - ETA: 1:13 - loss: 1.0127 - regression_loss: 0.8936 - classification_loss: 0.1191 205/500 [===========>..................] - ETA: 1:13 - loss: 1.0124 - regression_loss: 0.8935 - classification_loss: 0.1188 206/500 [===========>..................] - ETA: 1:13 - loss: 1.0140 - regression_loss: 0.8950 - classification_loss: 0.1190 207/500 [===========>..................] - ETA: 1:13 - loss: 1.0131 - regression_loss: 0.8943 - classification_loss: 0.1187 208/500 [===========>..................] - ETA: 1:12 - loss: 1.0111 - regression_loss: 0.8928 - classification_loss: 0.1184 209/500 [===========>..................] - ETA: 1:12 - loss: 1.0119 - regression_loss: 0.8932 - classification_loss: 0.1187 210/500 [===========>..................] - ETA: 1:12 - loss: 1.0144 - regression_loss: 0.8954 - classification_loss: 0.1191 211/500 [===========>..................] - ETA: 1:12 - loss: 1.0134 - regression_loss: 0.8946 - classification_loss: 0.1188 212/500 [===========>..................] - ETA: 1:11 - loss: 1.0141 - regression_loss: 0.8952 - classification_loss: 0.1189 213/500 [===========>..................] - ETA: 1:11 - loss: 1.0180 - regression_loss: 0.8982 - classification_loss: 0.1198 214/500 [===========>..................] - ETA: 1:11 - loss: 1.0147 - regression_loss: 0.8952 - classification_loss: 0.1195 215/500 [===========>..................] - ETA: 1:11 - loss: 1.0120 - regression_loss: 0.8930 - classification_loss: 0.1190 216/500 [===========>..................] - ETA: 1:10 - loss: 1.0121 - regression_loss: 0.8930 - classification_loss: 0.1191 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.0098 - regression_loss: 0.8911 - classification_loss: 0.1187 218/500 [============>.................] - ETA: 1:10 - loss: 1.0112 - regression_loss: 0.8921 - classification_loss: 0.1190 219/500 [============>.................] - ETA: 1:10 - loss: 1.0119 - regression_loss: 0.8927 - classification_loss: 0.1191 220/500 [============>.................] - ETA: 1:09 - loss: 1.0109 - regression_loss: 0.8920 - classification_loss: 0.1189 221/500 [============>.................] - ETA: 1:09 - loss: 1.0126 - regression_loss: 0.8936 - classification_loss: 0.1189 222/500 [============>.................] - ETA: 1:09 - loss: 1.0155 - regression_loss: 0.8959 - classification_loss: 0.1196 223/500 [============>.................] - ETA: 1:09 - loss: 1.0166 - regression_loss: 0.8969 - classification_loss: 0.1197 224/500 [============>.................] - ETA: 1:08 - loss: 1.0206 - regression_loss: 0.9002 - classification_loss: 0.1204 225/500 [============>.................] - ETA: 1:08 - loss: 1.0271 - regression_loss: 0.9060 - classification_loss: 0.1211 226/500 [============>.................] - ETA: 1:08 - loss: 1.0256 - regression_loss: 0.9047 - classification_loss: 0.1209 227/500 [============>.................] - ETA: 1:08 - loss: 1.0267 - regression_loss: 0.9055 - classification_loss: 0.1212 228/500 [============>.................] - ETA: 1:07 - loss: 1.0270 - regression_loss: 0.9057 - classification_loss: 0.1213 229/500 [============>.................] - ETA: 1:07 - loss: 1.0293 - regression_loss: 0.9077 - classification_loss: 0.1216 230/500 [============>.................] - ETA: 1:07 - loss: 1.0292 - regression_loss: 0.9078 - classification_loss: 0.1215 231/500 [============>.................] - ETA: 1:07 - loss: 1.0306 - regression_loss: 0.9088 - classification_loss: 0.1218 232/500 [============>.................] - ETA: 1:06 - loss: 1.0292 - regression_loss: 0.9076 - classification_loss: 0.1216 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.0280 - regression_loss: 0.9064 - classification_loss: 0.1216 234/500 [=============>................] - ETA: 1:06 - loss: 1.0281 - regression_loss: 0.9066 - classification_loss: 0.1215 235/500 [=============>................] - ETA: 1:06 - loss: 1.0261 - regression_loss: 0.9050 - classification_loss: 0.1211 236/500 [=============>................] - ETA: 1:05 - loss: 1.0273 - regression_loss: 0.9061 - classification_loss: 0.1212 237/500 [=============>................] - ETA: 1:05 - loss: 1.0281 - regression_loss: 0.9067 - classification_loss: 0.1214 238/500 [=============>................] - ETA: 1:05 - loss: 1.0283 - regression_loss: 0.9070 - classification_loss: 0.1213 239/500 [=============>................] - ETA: 1:05 - loss: 1.0289 - regression_loss: 0.9076 - classification_loss: 0.1213 240/500 [=============>................] - ETA: 1:04 - loss: 1.0308 - regression_loss: 0.9092 - classification_loss: 0.1215 241/500 [=============>................] - ETA: 1:04 - loss: 1.0293 - regression_loss: 0.9080 - classification_loss: 0.1212 242/500 [=============>................] - ETA: 1:04 - loss: 1.0293 - regression_loss: 0.9083 - classification_loss: 0.1210 243/500 [=============>................] - ETA: 1:04 - loss: 1.0301 - regression_loss: 0.9089 - classification_loss: 0.1212 244/500 [=============>................] - ETA: 1:03 - loss: 1.0291 - regression_loss: 0.9081 - classification_loss: 0.1210 245/500 [=============>................] - ETA: 1:03 - loss: 1.0289 - regression_loss: 0.9082 - classification_loss: 0.1208 246/500 [=============>................] - ETA: 1:03 - loss: 1.0283 - regression_loss: 0.9076 - classification_loss: 0.1207 247/500 [=============>................] - ETA: 1:03 - loss: 1.0285 - regression_loss: 0.9079 - classification_loss: 0.1206 248/500 [=============>................] - ETA: 1:02 - loss: 1.0266 - regression_loss: 0.9063 - classification_loss: 0.1203 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.0269 - regression_loss: 0.9065 - classification_loss: 0.1204 250/500 [==============>...............] - ETA: 1:02 - loss: 1.0286 - regression_loss: 0.9081 - classification_loss: 0.1204 251/500 [==============>...............] - ETA: 1:02 - loss: 1.0292 - regression_loss: 0.9087 - classification_loss: 0.1205 252/500 [==============>...............] - ETA: 1:01 - loss: 1.0308 - regression_loss: 0.9100 - classification_loss: 0.1208 253/500 [==============>...............] - ETA: 1:01 - loss: 1.0302 - regression_loss: 0.9095 - classification_loss: 0.1207 254/500 [==============>...............] - ETA: 1:01 - loss: 1.0306 - regression_loss: 0.9099 - classification_loss: 0.1206 255/500 [==============>...............] - ETA: 1:01 - loss: 1.0307 - regression_loss: 0.9103 - classification_loss: 0.1204 256/500 [==============>...............] - ETA: 1:00 - loss: 1.0323 - regression_loss: 0.9119 - classification_loss: 0.1204 257/500 [==============>...............] - ETA: 1:00 - loss: 1.0338 - regression_loss: 0.9132 - classification_loss: 0.1206 258/500 [==============>...............] - ETA: 1:00 - loss: 1.0349 - regression_loss: 0.9141 - classification_loss: 0.1208 259/500 [==============>...............] - ETA: 1:00 - loss: 1.0355 - regression_loss: 0.9147 - classification_loss: 0.1208 260/500 [==============>...............] - ETA: 59s - loss: 1.0363 - regression_loss: 0.9154 - classification_loss: 0.1210  261/500 [==============>...............] - ETA: 59s - loss: 1.0350 - regression_loss: 0.9143 - classification_loss: 0.1207 262/500 [==============>...............] - ETA: 59s - loss: 1.0350 - regression_loss: 0.9144 - classification_loss: 0.1207 263/500 [==============>...............] - ETA: 59s - loss: 1.0350 - regression_loss: 0.9144 - classification_loss: 0.1206 264/500 [==============>...............] - ETA: 58s - loss: 1.0379 - regression_loss: 0.9172 - classification_loss: 0.1207 265/500 [==============>...............] 
- ETA: 58s - loss: 1.0388 - regression_loss: 0.9180 - classification_loss: 0.1208 266/500 [==============>...............] - ETA: 58s - loss: 1.0376 - regression_loss: 0.9171 - classification_loss: 0.1205 267/500 [===============>..............] - ETA: 58s - loss: 1.0376 - regression_loss: 0.9171 - classification_loss: 0.1205 268/500 [===============>..............] - ETA: 57s - loss: 1.0392 - regression_loss: 0.9184 - classification_loss: 0.1208 269/500 [===============>..............] - ETA: 57s - loss: 1.0376 - regression_loss: 0.9170 - classification_loss: 0.1205 270/500 [===============>..............] - ETA: 57s - loss: 1.0370 - regression_loss: 0.9165 - classification_loss: 0.1205 271/500 [===============>..............] - ETA: 57s - loss: 1.0366 - regression_loss: 0.9161 - classification_loss: 0.1204 272/500 [===============>..............] - ETA: 56s - loss: 1.0356 - regression_loss: 0.9153 - classification_loss: 0.1202 273/500 [===============>..............] - ETA: 56s - loss: 1.0343 - regression_loss: 0.9142 - classification_loss: 0.1200 274/500 [===============>..............] - ETA: 56s - loss: 1.0362 - regression_loss: 0.9161 - classification_loss: 0.1201 275/500 [===============>..............] - ETA: 56s - loss: 1.0362 - regression_loss: 0.9160 - classification_loss: 0.1202 276/500 [===============>..............] - ETA: 55s - loss: 1.0365 - regression_loss: 0.9161 - classification_loss: 0.1204 277/500 [===============>..............] - ETA: 55s - loss: 1.0344 - regression_loss: 0.9144 - classification_loss: 0.1201 278/500 [===============>..............] - ETA: 55s - loss: 1.0336 - regression_loss: 0.9137 - classification_loss: 0.1199 279/500 [===============>..............] - ETA: 55s - loss: 1.0335 - regression_loss: 0.9137 - classification_loss: 0.1198 280/500 [===============>..............] - ETA: 54s - loss: 1.0338 - regression_loss: 0.9139 - classification_loss: 0.1200 281/500 [===============>..............] 
- ETA: 54s - loss: 1.0315 - regression_loss: 0.9119 - classification_loss: 0.1196 282/500 [===============>..............] - ETA: 54s - loss: 1.0320 - regression_loss: 0.9123 - classification_loss: 0.1197 283/500 [===============>..............] - ETA: 54s - loss: 1.0328 - regression_loss: 0.9129 - classification_loss: 0.1199 284/500 [================>.............] - ETA: 53s - loss: 1.0328 - regression_loss: 0.9132 - classification_loss: 0.1197 285/500 [================>.............] - ETA: 53s - loss: 1.0322 - regression_loss: 0.9127 - classification_loss: 0.1195 286/500 [================>.............] - ETA: 53s - loss: 1.0302 - regression_loss: 0.9111 - classification_loss: 0.1192 287/500 [================>.............] - ETA: 53s - loss: 1.0305 - regression_loss: 0.9113 - classification_loss: 0.1192 288/500 [================>.............] - ETA: 52s - loss: 1.0320 - regression_loss: 0.9127 - classification_loss: 0.1194 289/500 [================>.............] - ETA: 52s - loss: 1.0334 - regression_loss: 0.9140 - classification_loss: 0.1194 290/500 [================>.............] - ETA: 52s - loss: 1.0318 - regression_loss: 0.9127 - classification_loss: 0.1191 291/500 [================>.............] - ETA: 52s - loss: 1.0323 - regression_loss: 0.9131 - classification_loss: 0.1191 292/500 [================>.............] - ETA: 51s - loss: 1.0316 - regression_loss: 0.9127 - classification_loss: 0.1189 293/500 [================>.............] - ETA: 51s - loss: 1.0323 - regression_loss: 0.9136 - classification_loss: 0.1187 294/500 [================>.............] - ETA: 51s - loss: 1.0301 - regression_loss: 0.9118 - classification_loss: 0.1183 295/500 [================>.............] - ETA: 51s - loss: 1.0300 - regression_loss: 0.9116 - classification_loss: 0.1184 296/500 [================>.............] - ETA: 50s - loss: 1.0288 - regression_loss: 0.9107 - classification_loss: 0.1181 297/500 [================>.............] 
- ETA: 50s - loss: 1.0296 - regression_loss: 0.9113 - classification_loss: 0.1183 298/500 [================>.............] - ETA: 50s - loss: 1.0277 - regression_loss: 0.9096 - classification_loss: 0.1180 299/500 [================>.............] - ETA: 50s - loss: 1.0278 - regression_loss: 0.9098 - classification_loss: 0.1180 300/500 [=================>............] - ETA: 49s - loss: 1.0276 - regression_loss: 0.9097 - classification_loss: 0.1179 301/500 [=================>............] - ETA: 49s - loss: 1.0283 - regression_loss: 0.9104 - classification_loss: 0.1178 302/500 [=================>............] - ETA: 49s - loss: 1.0288 - regression_loss: 0.9112 - classification_loss: 0.1176 303/500 [=================>............] - ETA: 49s - loss: 1.0309 - regression_loss: 0.9128 - classification_loss: 0.1181 304/500 [=================>............] - ETA: 48s - loss: 1.0328 - regression_loss: 0.9145 - classification_loss: 0.1184 305/500 [=================>............] - ETA: 48s - loss: 1.0323 - regression_loss: 0.9142 - classification_loss: 0.1181 306/500 [=================>............] - ETA: 48s - loss: 1.0310 - regression_loss: 0.9131 - classification_loss: 0.1179 307/500 [=================>............] - ETA: 48s - loss: 1.0319 - regression_loss: 0.9139 - classification_loss: 0.1181 308/500 [=================>............] - ETA: 47s - loss: 1.0320 - regression_loss: 0.9140 - classification_loss: 0.1180 309/500 [=================>............] - ETA: 47s - loss: 1.0304 - regression_loss: 0.9126 - classification_loss: 0.1178 310/500 [=================>............] - ETA: 47s - loss: 1.0297 - regression_loss: 0.9120 - classification_loss: 0.1177 311/500 [=================>............] - ETA: 47s - loss: 1.0296 - regression_loss: 0.9120 - classification_loss: 0.1176 312/500 [=================>............] - ETA: 46s - loss: 1.0298 - regression_loss: 0.9122 - classification_loss: 0.1176 313/500 [=================>............] 
- ETA: 46s - loss: 1.0284 - regression_loss: 0.9109 - classification_loss: 0.1175 314/500 [=================>............] - ETA: 46s - loss: 1.0285 - regression_loss: 0.9110 - classification_loss: 0.1175 315/500 [=================>............] - ETA: 46s - loss: 1.0274 - regression_loss: 0.9101 - classification_loss: 0.1173 316/500 [=================>............] - ETA: 45s - loss: 1.0284 - regression_loss: 0.9110 - classification_loss: 0.1174 317/500 [==================>...........] - ETA: 45s - loss: 1.0297 - regression_loss: 0.9120 - classification_loss: 0.1176 318/500 [==================>...........] - ETA: 45s - loss: 1.0302 - regression_loss: 0.9125 - classification_loss: 0.1177 319/500 [==================>...........] - ETA: 45s - loss: 1.0292 - regression_loss: 0.9118 - classification_loss: 0.1175 320/500 [==================>...........] - ETA: 44s - loss: 1.0286 - regression_loss: 0.9113 - classification_loss: 0.1173 321/500 [==================>...........] - ETA: 44s - loss: 1.0295 - regression_loss: 0.9120 - classification_loss: 0.1175 322/500 [==================>...........] - ETA: 44s - loss: 1.0289 - regression_loss: 0.9116 - classification_loss: 0.1174 323/500 [==================>...........] - ETA: 44s - loss: 1.0276 - regression_loss: 0.9105 - classification_loss: 0.1171 324/500 [==================>...........] - ETA: 43s - loss: 1.0281 - regression_loss: 0.9109 - classification_loss: 0.1172 325/500 [==================>...........] - ETA: 43s - loss: 1.0267 - regression_loss: 0.9096 - classification_loss: 0.1171 326/500 [==================>...........] - ETA: 43s - loss: 1.0285 - regression_loss: 0.9111 - classification_loss: 0.1174 327/500 [==================>...........] - ETA: 43s - loss: 1.0274 - regression_loss: 0.9101 - classification_loss: 0.1173 328/500 [==================>...........] - ETA: 42s - loss: 1.0257 - regression_loss: 0.9086 - classification_loss: 0.1171 329/500 [==================>...........] 
- ETA: 42s - loss: 1.0247 - regression_loss: 0.9079 - classification_loss: 0.1168 330/500 [==================>...........] - ETA: 42s - loss: 1.0264 - regression_loss: 0.9092 - classification_loss: 0.1172 331/500 [==================>...........] - ETA: 42s - loss: 1.0263 - regression_loss: 0.9091 - classification_loss: 0.1172 332/500 [==================>...........] - ETA: 41s - loss: 1.0256 - regression_loss: 0.9086 - classification_loss: 0.1170 333/500 [==================>...........] - ETA: 41s - loss: 1.0244 - regression_loss: 0.9076 - classification_loss: 0.1168 334/500 [===================>..........] - ETA: 41s - loss: 1.0246 - regression_loss: 0.9078 - classification_loss: 0.1168 335/500 [===================>..........] - ETA: 41s - loss: 1.0224 - regression_loss: 0.9059 - classification_loss: 0.1165 336/500 [===================>..........] - ETA: 40s - loss: 1.0225 - regression_loss: 0.9061 - classification_loss: 0.1164 337/500 [===================>..........] - ETA: 40s - loss: 1.0223 - regression_loss: 0.9060 - classification_loss: 0.1163 338/500 [===================>..........] - ETA: 40s - loss: 1.0213 - regression_loss: 0.9053 - classification_loss: 0.1161 339/500 [===================>..........] - ETA: 40s - loss: 1.0213 - regression_loss: 0.9054 - classification_loss: 0.1159 340/500 [===================>..........] - ETA: 39s - loss: 1.0208 - regression_loss: 0.9050 - classification_loss: 0.1158 341/500 [===================>..........] - ETA: 39s - loss: 1.0212 - regression_loss: 0.9054 - classification_loss: 0.1158 342/500 [===================>..........] - ETA: 39s - loss: 1.0223 - regression_loss: 0.9063 - classification_loss: 0.1160 343/500 [===================>..........] - ETA: 39s - loss: 1.0227 - regression_loss: 0.9067 - classification_loss: 0.1160 344/500 [===================>..........] - ETA: 38s - loss: 1.0224 - regression_loss: 0.9064 - classification_loss: 0.1159 345/500 [===================>..........] 
- ETA: 38s - loss: 1.0228 - regression_loss: 0.9068 - classification_loss: 0.1160
[per-batch progress updates for steps 346-489 elided]
489/500 [============================>.]
- ETA: 2s - loss: 1.0223 - regression_loss: 0.9068 - classification_loss: 0.1154
[per-batch progress updates for steps 490-499 elided]
500/500 [==============================] - 125s 249ms/step - loss: 1.0206 - regression_loss: 0.9053 - classification_loss: 0.1153
1172 instances of class plum with average precision: 0.7806
mAP: 0.7806
Epoch 00036: saving model to ./training/snapshots/resnet50_pascal_36.h5
Epoch 37/150
[per-batch progress updates for steps 1-4 elided]
4/500 [..............................]
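As a quick sanity check on the epoch-36 summary above (a minimal standalone sketch, not part of the training script): the reported `loss` should be the sum of the regression and classification heads, since RetinaNet trains on the smooth-L1 box-regression loss plus the focal classification loss.

```python
# Final epoch-36 values as printed in the training summary above.
regression_loss = 0.9053      # box-regression (smooth L1) head
classification_loss = 0.1153  # classification (focal) head
total_loss = 1.0206           # reported combined loss

# The two heads should add up to the reported total,
# up to rounding in the printed values.
assert abs((regression_loss + classification_loss) - total_loss) < 1e-3
print("loss components sum to the reported total")
```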
- ETA: 2:03 - loss: 0.7977 - regression_loss: 0.7312 - classification_loss: 0.0665
[per-batch progress updates for steps 5-180 elided]
180/500 [=========>....................]
- ETA: 1:19 - loss: 1.0410 - regression_loss: 0.9274 - classification_loss: 0.1136 181/500 [=========>....................] - ETA: 1:19 - loss: 1.0438 - regression_loss: 0.9298 - classification_loss: 0.1140 182/500 [=========>....................] - ETA: 1:19 - loss: 1.0433 - regression_loss: 0.9290 - classification_loss: 0.1143 183/500 [=========>....................] - ETA: 1:18 - loss: 1.0451 - regression_loss: 0.9304 - classification_loss: 0.1146 184/500 [==========>...................] - ETA: 1:18 - loss: 1.0495 - regression_loss: 0.9347 - classification_loss: 0.1148 185/500 [==========>...................] - ETA: 1:18 - loss: 1.0493 - regression_loss: 0.9343 - classification_loss: 0.1149 186/500 [==========>...................] - ETA: 1:18 - loss: 1.0494 - regression_loss: 0.9346 - classification_loss: 0.1148 187/500 [==========>...................] - ETA: 1:17 - loss: 1.0514 - regression_loss: 0.9362 - classification_loss: 0.1152 188/500 [==========>...................] - ETA: 1:17 - loss: 1.0500 - regression_loss: 0.9349 - classification_loss: 0.1151 189/500 [==========>...................] - ETA: 1:17 - loss: 1.0512 - regression_loss: 0.9358 - classification_loss: 0.1153 190/500 [==========>...................] - ETA: 1:17 - loss: 1.0471 - regression_loss: 0.9323 - classification_loss: 0.1148 191/500 [==========>...................] - ETA: 1:17 - loss: 1.0438 - regression_loss: 0.9294 - classification_loss: 0.1143 192/500 [==========>...................] - ETA: 1:16 - loss: 1.0413 - regression_loss: 0.9272 - classification_loss: 0.1140 193/500 [==========>...................] - ETA: 1:16 - loss: 1.0431 - regression_loss: 0.9286 - classification_loss: 0.1145 194/500 [==========>...................] - ETA: 1:16 - loss: 1.0407 - regression_loss: 0.9266 - classification_loss: 0.1141 195/500 [==========>...................] - ETA: 1:16 - loss: 1.0379 - regression_loss: 0.9241 - classification_loss: 0.1138 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.0364 - regression_loss: 0.9229 - classification_loss: 0.1134 197/500 [==========>...................] - ETA: 1:15 - loss: 1.0354 - regression_loss: 0.9221 - classification_loss: 0.1133 198/500 [==========>...................] - ETA: 1:15 - loss: 1.0361 - regression_loss: 0.9226 - classification_loss: 0.1135 199/500 [==========>...................] - ETA: 1:15 - loss: 1.0338 - regression_loss: 0.9207 - classification_loss: 0.1131 200/500 [===========>..................] - ETA: 1:14 - loss: 1.0345 - regression_loss: 0.9215 - classification_loss: 0.1129 201/500 [===========>..................] - ETA: 1:14 - loss: 1.0348 - regression_loss: 0.9218 - classification_loss: 0.1130 202/500 [===========>..................] - ETA: 1:14 - loss: 1.0352 - regression_loss: 0.9222 - classification_loss: 0.1130 203/500 [===========>..................] - ETA: 1:14 - loss: 1.0345 - regression_loss: 0.9217 - classification_loss: 0.1128 204/500 [===========>..................] - ETA: 1:13 - loss: 1.0354 - regression_loss: 0.9227 - classification_loss: 0.1127 205/500 [===========>..................] - ETA: 1:13 - loss: 1.0356 - regression_loss: 0.9230 - classification_loss: 0.1125 206/500 [===========>..................] - ETA: 1:13 - loss: 1.0351 - regression_loss: 0.9229 - classification_loss: 0.1121 207/500 [===========>..................] - ETA: 1:13 - loss: 1.0378 - regression_loss: 0.9251 - classification_loss: 0.1127 208/500 [===========>..................] - ETA: 1:12 - loss: 1.0373 - regression_loss: 0.9248 - classification_loss: 0.1125 209/500 [===========>..................] - ETA: 1:12 - loss: 1.0359 - regression_loss: 0.9237 - classification_loss: 0.1122 210/500 [===========>..................] - ETA: 1:12 - loss: 1.0352 - regression_loss: 0.9231 - classification_loss: 0.1121 211/500 [===========>..................] - ETA: 1:12 - loss: 1.0367 - regression_loss: 0.9243 - classification_loss: 0.1124 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.0366 - regression_loss: 0.9242 - classification_loss: 0.1124 213/500 [===========>..................] - ETA: 1:11 - loss: 1.0378 - regression_loss: 0.9252 - classification_loss: 0.1125 214/500 [===========>..................] - ETA: 1:11 - loss: 1.0356 - regression_loss: 0.9234 - classification_loss: 0.1121 215/500 [===========>..................] - ETA: 1:11 - loss: 1.0346 - regression_loss: 0.9228 - classification_loss: 0.1118 216/500 [===========>..................] - ETA: 1:10 - loss: 1.0397 - regression_loss: 0.9268 - classification_loss: 0.1129 217/500 [============>.................] - ETA: 1:10 - loss: 1.0391 - regression_loss: 0.9263 - classification_loss: 0.1128 218/500 [============>.................] - ETA: 1:10 - loss: 1.0406 - regression_loss: 0.9276 - classification_loss: 0.1130 219/500 [============>.................] - ETA: 1:10 - loss: 1.0400 - regression_loss: 0.9269 - classification_loss: 0.1130 220/500 [============>.................] - ETA: 1:09 - loss: 1.0410 - regression_loss: 0.9277 - classification_loss: 0.1132 221/500 [============>.................] - ETA: 1:09 - loss: 1.0401 - regression_loss: 0.9271 - classification_loss: 0.1131 222/500 [============>.................] - ETA: 1:09 - loss: 1.0417 - regression_loss: 0.9284 - classification_loss: 0.1133 223/500 [============>.................] - ETA: 1:09 - loss: 1.0438 - regression_loss: 0.9303 - classification_loss: 0.1136 224/500 [============>.................] - ETA: 1:08 - loss: 1.0441 - regression_loss: 0.9304 - classification_loss: 0.1137 225/500 [============>.................] - ETA: 1:08 - loss: 1.0436 - regression_loss: 0.9299 - classification_loss: 0.1136 226/500 [============>.................] - ETA: 1:08 - loss: 1.0413 - regression_loss: 0.9279 - classification_loss: 0.1134 227/500 [============>.................] - ETA: 1:08 - loss: 1.0397 - regression_loss: 0.9266 - classification_loss: 0.1131 228/500 [============>.................] 
- ETA: 1:07 - loss: 1.0401 - regression_loss: 0.9270 - classification_loss: 0.1131 229/500 [============>.................] - ETA: 1:07 - loss: 1.0400 - regression_loss: 0.9268 - classification_loss: 0.1132 230/500 [============>.................] - ETA: 1:07 - loss: 1.0372 - regression_loss: 0.9244 - classification_loss: 0.1128 231/500 [============>.................] - ETA: 1:07 - loss: 1.0377 - regression_loss: 0.9247 - classification_loss: 0.1130 232/500 [============>.................] - ETA: 1:06 - loss: 1.0402 - regression_loss: 0.9270 - classification_loss: 0.1132 233/500 [============>.................] - ETA: 1:06 - loss: 1.0401 - regression_loss: 0.9270 - classification_loss: 0.1131 234/500 [=============>................] - ETA: 1:06 - loss: 1.0401 - regression_loss: 0.9270 - classification_loss: 0.1131 235/500 [=============>................] - ETA: 1:06 - loss: 1.0391 - regression_loss: 0.9263 - classification_loss: 0.1128 236/500 [=============>................] - ETA: 1:05 - loss: 1.0399 - regression_loss: 0.9269 - classification_loss: 0.1130 237/500 [=============>................] - ETA: 1:05 - loss: 1.0431 - regression_loss: 0.9295 - classification_loss: 0.1136 238/500 [=============>................] - ETA: 1:05 - loss: 1.0436 - regression_loss: 0.9301 - classification_loss: 0.1136 239/500 [=============>................] - ETA: 1:05 - loss: 1.0439 - regression_loss: 0.9303 - classification_loss: 0.1136 240/500 [=============>................] - ETA: 1:04 - loss: 1.0443 - regression_loss: 0.9305 - classification_loss: 0.1138 241/500 [=============>................] - ETA: 1:04 - loss: 1.0444 - regression_loss: 0.9307 - classification_loss: 0.1136 242/500 [=============>................] - ETA: 1:04 - loss: 1.0454 - regression_loss: 0.9315 - classification_loss: 0.1139 243/500 [=============>................] - ETA: 1:04 - loss: 1.0429 - regression_loss: 0.9293 - classification_loss: 0.1136 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.0404 - regression_loss: 0.9271 - classification_loss: 0.1133 245/500 [=============>................] - ETA: 1:03 - loss: 1.0405 - regression_loss: 0.9272 - classification_loss: 0.1133 246/500 [=============>................] - ETA: 1:03 - loss: 1.0401 - regression_loss: 0.9265 - classification_loss: 0.1136 247/500 [=============>................] - ETA: 1:03 - loss: 1.0400 - regression_loss: 0.9262 - classification_loss: 0.1137 248/500 [=============>................] - ETA: 1:02 - loss: 1.0401 - regression_loss: 0.9264 - classification_loss: 0.1137 249/500 [=============>................] - ETA: 1:02 - loss: 1.0373 - regression_loss: 0.9240 - classification_loss: 0.1133 250/500 [==============>...............] - ETA: 1:02 - loss: 1.0355 - regression_loss: 0.9225 - classification_loss: 0.1130 251/500 [==============>...............] - ETA: 1:02 - loss: 1.0356 - regression_loss: 0.9225 - classification_loss: 0.1131 252/500 [==============>...............] - ETA: 1:01 - loss: 1.0365 - regression_loss: 0.9233 - classification_loss: 0.1132 253/500 [==============>...............] - ETA: 1:01 - loss: 1.0374 - regression_loss: 0.9240 - classification_loss: 0.1134 254/500 [==============>...............] - ETA: 1:01 - loss: 1.0378 - regression_loss: 0.9243 - classification_loss: 0.1135 255/500 [==============>...............] - ETA: 1:01 - loss: 1.0400 - regression_loss: 0.9261 - classification_loss: 0.1139 256/500 [==============>...............] - ETA: 1:00 - loss: 1.0378 - regression_loss: 0.9238 - classification_loss: 0.1140 257/500 [==============>...............] - ETA: 1:00 - loss: 1.0372 - regression_loss: 0.9232 - classification_loss: 0.1139 258/500 [==============>...............] - ETA: 1:00 - loss: 1.0378 - regression_loss: 0.9237 - classification_loss: 0.1140 259/500 [==============>...............] - ETA: 1:00 - loss: 1.0377 - regression_loss: 0.9239 - classification_loss: 0.1138 260/500 [==============>...............] 
- ETA: 59s - loss: 1.0368 - regression_loss: 0.9233 - classification_loss: 0.1136  261/500 [==============>...............] - ETA: 59s - loss: 1.0364 - regression_loss: 0.9228 - classification_loss: 0.1136 262/500 [==============>...............] - ETA: 59s - loss: 1.0358 - regression_loss: 0.9223 - classification_loss: 0.1135 263/500 [==============>...............] - ETA: 59s - loss: 1.0361 - regression_loss: 0.9225 - classification_loss: 0.1135 264/500 [==============>...............] - ETA: 58s - loss: 1.0343 - regression_loss: 0.9210 - classification_loss: 0.1133 265/500 [==============>...............] - ETA: 58s - loss: 1.0327 - regression_loss: 0.9197 - classification_loss: 0.1130 266/500 [==============>...............] - ETA: 58s - loss: 1.0329 - regression_loss: 0.9199 - classification_loss: 0.1130 267/500 [===============>..............] - ETA: 58s - loss: 1.0314 - regression_loss: 0.9186 - classification_loss: 0.1128 268/500 [===============>..............] - ETA: 57s - loss: 1.0297 - regression_loss: 0.9172 - classification_loss: 0.1125 269/500 [===============>..............] - ETA: 57s - loss: 1.0294 - regression_loss: 0.9169 - classification_loss: 0.1126 270/500 [===============>..............] - ETA: 57s - loss: 1.0303 - regression_loss: 0.9176 - classification_loss: 0.1127 271/500 [===============>..............] - ETA: 57s - loss: 1.0279 - regression_loss: 0.9155 - classification_loss: 0.1124 272/500 [===============>..............] - ETA: 56s - loss: 1.0281 - regression_loss: 0.9157 - classification_loss: 0.1124 273/500 [===============>..............] - ETA: 56s - loss: 1.0269 - regression_loss: 0.9147 - classification_loss: 0.1122 274/500 [===============>..............] - ETA: 56s - loss: 1.0277 - regression_loss: 0.9154 - classification_loss: 0.1123 275/500 [===============>..............] - ETA: 56s - loss: 1.0257 - regression_loss: 0.9136 - classification_loss: 0.1121 276/500 [===============>..............] 
- ETA: 55s - loss: 1.0241 - regression_loss: 0.9123 - classification_loss: 0.1119 277/500 [===============>..............] - ETA: 55s - loss: 1.0233 - regression_loss: 0.9116 - classification_loss: 0.1118 278/500 [===============>..............] - ETA: 55s - loss: 1.0237 - regression_loss: 0.9119 - classification_loss: 0.1118 279/500 [===============>..............] - ETA: 55s - loss: 1.0210 - regression_loss: 0.9096 - classification_loss: 0.1114 280/500 [===============>..............] - ETA: 54s - loss: 1.0219 - regression_loss: 0.9102 - classification_loss: 0.1116 281/500 [===============>..............] - ETA: 54s - loss: 1.0219 - regression_loss: 0.9104 - classification_loss: 0.1115 282/500 [===============>..............] - ETA: 54s - loss: 1.0225 - regression_loss: 0.9109 - classification_loss: 0.1115 283/500 [===============>..............] - ETA: 54s - loss: 1.0216 - regression_loss: 0.9103 - classification_loss: 0.1113 284/500 [================>.............] - ETA: 53s - loss: 1.0207 - regression_loss: 0.9095 - classification_loss: 0.1112 285/500 [================>.............] - ETA: 53s - loss: 1.0203 - regression_loss: 0.9092 - classification_loss: 0.1111 286/500 [================>.............] - ETA: 53s - loss: 1.0217 - regression_loss: 0.9103 - classification_loss: 0.1114 287/500 [================>.............] - ETA: 53s - loss: 1.0213 - regression_loss: 0.9100 - classification_loss: 0.1113 288/500 [================>.............] - ETA: 52s - loss: 1.0243 - regression_loss: 0.9122 - classification_loss: 0.1121 289/500 [================>.............] - ETA: 52s - loss: 1.0244 - regression_loss: 0.9122 - classification_loss: 0.1122 290/500 [================>.............] - ETA: 52s - loss: 1.0257 - regression_loss: 0.9133 - classification_loss: 0.1124 291/500 [================>.............] - ETA: 52s - loss: 1.0252 - regression_loss: 0.9127 - classification_loss: 0.1125 292/500 [================>.............] 
- ETA: 51s - loss: 1.0236 - regression_loss: 0.9114 - classification_loss: 0.1122 293/500 [================>.............] - ETA: 51s - loss: 1.0227 - regression_loss: 0.9107 - classification_loss: 0.1121 294/500 [================>.............] - ETA: 51s - loss: 1.0217 - regression_loss: 0.9097 - classification_loss: 0.1120 295/500 [================>.............] - ETA: 51s - loss: 1.0235 - regression_loss: 0.9113 - classification_loss: 0.1122 296/500 [================>.............] - ETA: 50s - loss: 1.0224 - regression_loss: 0.9103 - classification_loss: 0.1121 297/500 [================>.............] - ETA: 50s - loss: 1.0227 - regression_loss: 0.9105 - classification_loss: 0.1122 298/500 [================>.............] - ETA: 50s - loss: 1.0237 - regression_loss: 0.9113 - classification_loss: 0.1124 299/500 [================>.............] - ETA: 50s - loss: 1.0218 - regression_loss: 0.9096 - classification_loss: 0.1122 300/500 [=================>............] - ETA: 49s - loss: 1.0216 - regression_loss: 0.9093 - classification_loss: 0.1123 301/500 [=================>............] - ETA: 49s - loss: 1.0196 - regression_loss: 0.9075 - classification_loss: 0.1121 302/500 [=================>............] - ETA: 49s - loss: 1.0191 - regression_loss: 0.9073 - classification_loss: 0.1118 303/500 [=================>............] - ETA: 49s - loss: 1.0188 - regression_loss: 0.9072 - classification_loss: 0.1116 304/500 [=================>............] - ETA: 48s - loss: 1.0171 - regression_loss: 0.9057 - classification_loss: 0.1114 305/500 [=================>............] - ETA: 48s - loss: 1.0171 - regression_loss: 0.9056 - classification_loss: 0.1115 306/500 [=================>............] - ETA: 48s - loss: 1.0191 - regression_loss: 0.9072 - classification_loss: 0.1119 307/500 [=================>............] - ETA: 48s - loss: 1.0184 - regression_loss: 0.9066 - classification_loss: 0.1118 308/500 [=================>............] 
- ETA: 47s - loss: 1.0169 - regression_loss: 0.9052 - classification_loss: 0.1117 309/500 [=================>............] - ETA: 47s - loss: 1.0143 - regression_loss: 0.9030 - classification_loss: 0.1114 310/500 [=================>............] - ETA: 47s - loss: 1.0134 - regression_loss: 0.9022 - classification_loss: 0.1112 311/500 [=================>............] - ETA: 47s - loss: 1.0139 - regression_loss: 0.9025 - classification_loss: 0.1113 312/500 [=================>............] - ETA: 46s - loss: 1.0112 - regression_loss: 0.9002 - classification_loss: 0.1110 313/500 [=================>............] - ETA: 46s - loss: 1.0124 - regression_loss: 0.9012 - classification_loss: 0.1112 314/500 [=================>............] - ETA: 46s - loss: 1.0123 - regression_loss: 0.9011 - classification_loss: 0.1112 315/500 [=================>............] - ETA: 46s - loss: 1.0123 - regression_loss: 0.9011 - classification_loss: 0.1111 316/500 [=================>............] - ETA: 45s - loss: 1.0126 - regression_loss: 0.9016 - classification_loss: 0.1110 317/500 [==================>...........] - ETA: 45s - loss: 1.0126 - regression_loss: 0.9014 - classification_loss: 0.1111 318/500 [==================>...........] - ETA: 45s - loss: 1.0116 - regression_loss: 0.9007 - classification_loss: 0.1109 319/500 [==================>...........] - ETA: 45s - loss: 1.0122 - regression_loss: 0.9012 - classification_loss: 0.1109 320/500 [==================>...........] - ETA: 44s - loss: 1.0119 - regression_loss: 0.9010 - classification_loss: 0.1110 321/500 [==================>...........] - ETA: 44s - loss: 1.0113 - regression_loss: 0.9005 - classification_loss: 0.1108 322/500 [==================>...........] - ETA: 44s - loss: 1.0110 - regression_loss: 0.9002 - classification_loss: 0.1108 323/500 [==================>...........] - ETA: 44s - loss: 1.0102 - regression_loss: 0.8996 - classification_loss: 0.1106 324/500 [==================>...........] 
- ETA: 43s - loss: 1.0092 - regression_loss: 0.8987 - classification_loss: 0.1104 325/500 [==================>...........] - ETA: 43s - loss: 1.0098 - regression_loss: 0.8993 - classification_loss: 0.1105 326/500 [==================>...........] - ETA: 43s - loss: 1.0089 - regression_loss: 0.8984 - classification_loss: 0.1105 327/500 [==================>...........] - ETA: 43s - loss: 1.0075 - regression_loss: 0.8972 - classification_loss: 0.1102 328/500 [==================>...........] - ETA: 42s - loss: 1.0091 - regression_loss: 0.8987 - classification_loss: 0.1104 329/500 [==================>...........] - ETA: 42s - loss: 1.0095 - regression_loss: 0.8990 - classification_loss: 0.1105 330/500 [==================>...........] - ETA: 42s - loss: 1.0105 - regression_loss: 0.8998 - classification_loss: 0.1107 331/500 [==================>...........] - ETA: 42s - loss: 1.0117 - regression_loss: 0.9007 - classification_loss: 0.1109 332/500 [==================>...........] - ETA: 41s - loss: 1.0121 - regression_loss: 0.9009 - classification_loss: 0.1112 333/500 [==================>...........] - ETA: 41s - loss: 1.0121 - regression_loss: 0.9010 - classification_loss: 0.1112 334/500 [===================>..........] - ETA: 41s - loss: 1.0136 - regression_loss: 0.9023 - classification_loss: 0.1114 335/500 [===================>..........] - ETA: 41s - loss: 1.0150 - regression_loss: 0.9034 - classification_loss: 0.1116 336/500 [===================>..........] - ETA: 40s - loss: 1.0151 - regression_loss: 0.9036 - classification_loss: 0.1116 337/500 [===================>..........] - ETA: 40s - loss: 1.0146 - regression_loss: 0.9032 - classification_loss: 0.1115 338/500 [===================>..........] - ETA: 40s - loss: 1.0165 - regression_loss: 0.9048 - classification_loss: 0.1118 339/500 [===================>..........] - ETA: 40s - loss: 1.0171 - regression_loss: 0.9053 - classification_loss: 0.1119 340/500 [===================>..........] 
- ETA: 39s - loss: 1.0156 - regression_loss: 0.9040 - classification_loss: 0.1116 341/500 [===================>..........] - ETA: 39s - loss: 1.0162 - regression_loss: 0.9046 - classification_loss: 0.1115 342/500 [===================>..........] - ETA: 39s - loss: 1.0168 - regression_loss: 0.9052 - classification_loss: 0.1116 343/500 [===================>..........] - ETA: 39s - loss: 1.0177 - regression_loss: 0.9058 - classification_loss: 0.1119 344/500 [===================>..........] - ETA: 38s - loss: 1.0164 - regression_loss: 0.9046 - classification_loss: 0.1118 345/500 [===================>..........] - ETA: 38s - loss: 1.0145 - regression_loss: 0.9030 - classification_loss: 0.1116 346/500 [===================>..........] - ETA: 38s - loss: 1.0133 - regression_loss: 0.9016 - classification_loss: 0.1116 347/500 [===================>..........] - ETA: 38s - loss: 1.0141 - regression_loss: 0.9024 - classification_loss: 0.1118 348/500 [===================>..........] - ETA: 37s - loss: 1.0130 - regression_loss: 0.9015 - classification_loss: 0.1115 349/500 [===================>..........] - ETA: 37s - loss: 1.0127 - regression_loss: 0.9012 - classification_loss: 0.1115 350/500 [====================>.........] - ETA: 37s - loss: 1.0140 - regression_loss: 0.9023 - classification_loss: 0.1117 351/500 [====================>.........] - ETA: 37s - loss: 1.0145 - regression_loss: 0.9028 - classification_loss: 0.1117 352/500 [====================>.........] - ETA: 36s - loss: 1.0134 - regression_loss: 0.9018 - classification_loss: 0.1116 353/500 [====================>.........] - ETA: 36s - loss: 1.0122 - regression_loss: 0.9008 - classification_loss: 0.1114 354/500 [====================>.........] - ETA: 36s - loss: 1.0127 - regression_loss: 0.9013 - classification_loss: 0.1114 355/500 [====================>.........] - ETA: 36s - loss: 1.0132 - regression_loss: 0.9018 - classification_loss: 0.1115 356/500 [====================>.........] 
- ETA: 35s - loss: 1.0126 - regression_loss: 0.9012 - classification_loss: 0.1114 357/500 [====================>.........] - ETA: 35s - loss: 1.0130 - regression_loss: 0.9016 - classification_loss: 0.1114 358/500 [====================>.........] - ETA: 35s - loss: 1.0135 - regression_loss: 0.9021 - classification_loss: 0.1114 359/500 [====================>.........] - ETA: 35s - loss: 1.0137 - regression_loss: 0.9023 - classification_loss: 0.1114 360/500 [====================>.........] - ETA: 34s - loss: 1.0127 - regression_loss: 0.9014 - classification_loss: 0.1113 361/500 [====================>.........] - ETA: 34s - loss: 1.0136 - regression_loss: 0.9021 - classification_loss: 0.1115 362/500 [====================>.........] - ETA: 34s - loss: 1.0141 - regression_loss: 0.9026 - classification_loss: 0.1115 363/500 [====================>.........] - ETA: 34s - loss: 1.0126 - regression_loss: 0.9013 - classification_loss: 0.1113 364/500 [====================>.........] - ETA: 33s - loss: 1.0138 - regression_loss: 0.9022 - classification_loss: 0.1115 365/500 [====================>.........] - ETA: 33s - loss: 1.0138 - regression_loss: 0.9022 - classification_loss: 0.1116 366/500 [====================>.........] - ETA: 33s - loss: 1.0148 - regression_loss: 0.9028 - classification_loss: 0.1119 367/500 [=====================>........] - ETA: 33s - loss: 1.0135 - regression_loss: 0.9018 - classification_loss: 0.1117 368/500 [=====================>........] - ETA: 32s - loss: 1.0140 - regression_loss: 0.9022 - classification_loss: 0.1118 369/500 [=====================>........] - ETA: 32s - loss: 1.0145 - regression_loss: 0.9022 - classification_loss: 0.1123 370/500 [=====================>........] - ETA: 32s - loss: 1.0135 - regression_loss: 0.9013 - classification_loss: 0.1123 371/500 [=====================>........] - ETA: 32s - loss: 1.0129 - regression_loss: 0.9008 - classification_loss: 0.1121 372/500 [=====================>........] 
- ETA: 31s - loss: 1.0111 - regression_loss: 0.8993 - classification_loss: 0.1119 373/500 [=====================>........] - ETA: 31s - loss: 1.0112 - regression_loss: 0.8994 - classification_loss: 0.1118 374/500 [=====================>........] - ETA: 31s - loss: 1.0119 - regression_loss: 0.9000 - classification_loss: 0.1120 375/500 [=====================>........] - ETA: 31s - loss: 1.0110 - regression_loss: 0.8991 - classification_loss: 0.1118 376/500 [=====================>........] - ETA: 30s - loss: 1.0114 - regression_loss: 0.8994 - classification_loss: 0.1120 377/500 [=====================>........] - ETA: 30s - loss: 1.0117 - regression_loss: 0.8996 - classification_loss: 0.1121 378/500 [=====================>........] - ETA: 30s - loss: 1.0122 - regression_loss: 0.9001 - classification_loss: 0.1122 379/500 [=====================>........] - ETA: 30s - loss: 1.0119 - regression_loss: 0.8998 - classification_loss: 0.1121 380/500 [=====================>........] - ETA: 29s - loss: 1.0106 - regression_loss: 0.8986 - classification_loss: 0.1120 381/500 [=====================>........] - ETA: 29s - loss: 1.0109 - regression_loss: 0.8988 - classification_loss: 0.1120 382/500 [=====================>........] - ETA: 29s - loss: 1.0112 - regression_loss: 0.8993 - classification_loss: 0.1119 383/500 [=====================>........] - ETA: 29s - loss: 1.0120 - regression_loss: 0.9000 - classification_loss: 0.1120 384/500 [======================>.......] - ETA: 28s - loss: 1.0132 - regression_loss: 0.9011 - classification_loss: 0.1122 385/500 [======================>.......] - ETA: 28s - loss: 1.0136 - regression_loss: 0.9014 - classification_loss: 0.1122 386/500 [======================>.......] - ETA: 28s - loss: 1.0127 - regression_loss: 0.9005 - classification_loss: 0.1122 387/500 [======================>.......] - ETA: 28s - loss: 1.0125 - regression_loss: 0.9003 - classification_loss: 0.1123 388/500 [======================>.......] 
[per-step progress output for steps 389-499 of Epoch 37 elided; loss hovered around 1.01-1.02, regression_loss ~0.90, classification_loss ~0.113]
500/500 [==============================] - 125s 250ms/step - loss: 1.0170 - regression_loss: 0.9042 - classification_loss: 0.1128
1172 instances of class plum with average precision: 0.7916
mAP: 0.7916
Epoch 00037: saving model to ./training/snapshots/resnet50_pascal_37.h5
Epoch 38/150
[per-step progress output for steps 1-13 of Epoch 38 elided; classification_loss jumped from ~0.09 to ~0.46 at step 11]
[per-step progress output for steps 14-222 of Epoch 38 elided; total loss settled near 1.03 as classification_loss decayed from ~0.39 to ~0.13]
- ETA: 1:09 - loss: 1.0320 - regression_loss: 0.8987 - classification_loss: 0.1333 223/500 [============>.................] - ETA: 1:08 - loss: 1.0320 - regression_loss: 0.8988 - classification_loss: 0.1332 224/500 [============>.................] - ETA: 1:08 - loss: 1.0325 - regression_loss: 0.8991 - classification_loss: 0.1333 225/500 [============>.................] - ETA: 1:08 - loss: 1.0332 - regression_loss: 0.8999 - classification_loss: 0.1333 226/500 [============>.................] - ETA: 1:08 - loss: 1.0336 - regression_loss: 0.9002 - classification_loss: 0.1333 227/500 [============>.................] - ETA: 1:07 - loss: 1.0318 - regression_loss: 0.8988 - classification_loss: 0.1330 228/500 [============>.................] - ETA: 1:07 - loss: 1.0335 - regression_loss: 0.9004 - classification_loss: 0.1331 229/500 [============>.................] - ETA: 1:07 - loss: 1.0347 - regression_loss: 0.9019 - classification_loss: 0.1328 230/500 [============>.................] - ETA: 1:07 - loss: 1.0346 - regression_loss: 0.9019 - classification_loss: 0.1327 231/500 [============>.................] - ETA: 1:06 - loss: 1.0344 - regression_loss: 0.9018 - classification_loss: 0.1326 232/500 [============>.................] - ETA: 1:06 - loss: 1.0346 - regression_loss: 0.9020 - classification_loss: 0.1325 233/500 [============>.................] - ETA: 1:06 - loss: 1.0355 - regression_loss: 0.9031 - classification_loss: 0.1324 234/500 [=============>................] - ETA: 1:06 - loss: 1.0333 - regression_loss: 0.9013 - classification_loss: 0.1320 235/500 [=============>................] - ETA: 1:05 - loss: 1.0307 - regression_loss: 0.8991 - classification_loss: 0.1316 236/500 [=============>................] - ETA: 1:05 - loss: 1.0297 - regression_loss: 0.8982 - classification_loss: 0.1315 237/500 [=============>................] - ETA: 1:05 - loss: 1.0265 - regression_loss: 0.8954 - classification_loss: 0.1310 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.0261 - regression_loss: 0.8952 - classification_loss: 0.1309 239/500 [=============>................] - ETA: 1:04 - loss: 1.0244 - regression_loss: 0.8939 - classification_loss: 0.1305 240/500 [=============>................] - ETA: 1:04 - loss: 1.0243 - regression_loss: 0.8939 - classification_loss: 0.1304 241/500 [=============>................] - ETA: 1:04 - loss: 1.0255 - regression_loss: 0.8951 - classification_loss: 0.1304 242/500 [=============>................] - ETA: 1:04 - loss: 1.0249 - regression_loss: 0.8947 - classification_loss: 0.1302 243/500 [=============>................] - ETA: 1:03 - loss: 1.0254 - regression_loss: 0.8954 - classification_loss: 0.1300 244/500 [=============>................] - ETA: 1:03 - loss: 1.0272 - regression_loss: 0.8970 - classification_loss: 0.1303 245/500 [=============>................] - ETA: 1:03 - loss: 1.0278 - regression_loss: 0.8976 - classification_loss: 0.1302 246/500 [=============>................] - ETA: 1:03 - loss: 1.0283 - regression_loss: 0.8980 - classification_loss: 0.1303 247/500 [=============>................] - ETA: 1:02 - loss: 1.0290 - regression_loss: 0.8989 - classification_loss: 0.1301 248/500 [=============>................] - ETA: 1:02 - loss: 1.0299 - regression_loss: 0.8999 - classification_loss: 0.1300 249/500 [=============>................] - ETA: 1:02 - loss: 1.0294 - regression_loss: 0.8994 - classification_loss: 0.1300 250/500 [==============>...............] - ETA: 1:02 - loss: 1.0320 - regression_loss: 0.9016 - classification_loss: 0.1305 251/500 [==============>...............] - ETA: 1:01 - loss: 1.0303 - regression_loss: 0.9002 - classification_loss: 0.1301 252/500 [==============>...............] - ETA: 1:01 - loss: 1.0295 - regression_loss: 0.8994 - classification_loss: 0.1301 253/500 [==============>...............] - ETA: 1:01 - loss: 1.0280 - regression_loss: 0.8982 - classification_loss: 0.1298 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.0291 - regression_loss: 0.8991 - classification_loss: 0.1300 255/500 [==============>...............] - ETA: 1:00 - loss: 1.0277 - regression_loss: 0.8980 - classification_loss: 0.1298 256/500 [==============>...............] - ETA: 1:00 - loss: 1.0247 - regression_loss: 0.8954 - classification_loss: 0.1293 257/500 [==============>...............] - ETA: 1:00 - loss: 1.0253 - regression_loss: 0.8957 - classification_loss: 0.1296 258/500 [==============>...............] - ETA: 1:00 - loss: 1.0231 - regression_loss: 0.8937 - classification_loss: 0.1293 259/500 [==============>...............] - ETA: 1:00 - loss: 1.0233 - regression_loss: 0.8942 - classification_loss: 0.1291 260/500 [==============>...............] - ETA: 59s - loss: 1.0222 - regression_loss: 0.8932 - classification_loss: 0.1290  261/500 [==============>...............] - ETA: 59s - loss: 1.0218 - regression_loss: 0.8932 - classification_loss: 0.1286 262/500 [==============>...............] - ETA: 59s - loss: 1.0196 - regression_loss: 0.8914 - classification_loss: 0.1283 263/500 [==============>...............] - ETA: 59s - loss: 1.0185 - regression_loss: 0.8905 - classification_loss: 0.1280 264/500 [==============>...............] - ETA: 58s - loss: 1.0193 - regression_loss: 0.8910 - classification_loss: 0.1282 265/500 [==============>...............] - ETA: 58s - loss: 1.0200 - regression_loss: 0.8917 - classification_loss: 0.1282 266/500 [==============>...............] - ETA: 58s - loss: 1.0193 - regression_loss: 0.8913 - classification_loss: 0.1279 267/500 [===============>..............] - ETA: 58s - loss: 1.0191 - regression_loss: 0.8912 - classification_loss: 0.1279 268/500 [===============>..............] - ETA: 57s - loss: 1.0172 - regression_loss: 0.8896 - classification_loss: 0.1276 269/500 [===============>..............] - ETA: 57s - loss: 1.0156 - regression_loss: 0.8884 - classification_loss: 0.1272 270/500 [===============>..............] 
- ETA: 57s - loss: 1.0145 - regression_loss: 0.8876 - classification_loss: 0.1268 271/500 [===============>..............] - ETA: 57s - loss: 1.0138 - regression_loss: 0.8872 - classification_loss: 0.1266 272/500 [===============>..............] - ETA: 56s - loss: 1.0143 - regression_loss: 0.8876 - classification_loss: 0.1267 273/500 [===============>..............] - ETA: 56s - loss: 1.0126 - regression_loss: 0.8862 - classification_loss: 0.1264 274/500 [===============>..............] - ETA: 56s - loss: 1.0122 - regression_loss: 0.8857 - classification_loss: 0.1265 275/500 [===============>..............] - ETA: 56s - loss: 1.0158 - regression_loss: 0.8892 - classification_loss: 0.1266 276/500 [===============>..............] - ETA: 55s - loss: 1.0140 - regression_loss: 0.8877 - classification_loss: 0.1263 277/500 [===============>..............] - ETA: 55s - loss: 1.0145 - regression_loss: 0.8881 - classification_loss: 0.1263 278/500 [===============>..............] - ETA: 55s - loss: 1.0145 - regression_loss: 0.8883 - classification_loss: 0.1262 279/500 [===============>..............] - ETA: 55s - loss: 1.0161 - regression_loss: 0.8897 - classification_loss: 0.1264 280/500 [===============>..............] - ETA: 54s - loss: 1.0157 - regression_loss: 0.8893 - classification_loss: 0.1264 281/500 [===============>..............] - ETA: 54s - loss: 1.0173 - regression_loss: 0.8908 - classification_loss: 0.1265 282/500 [===============>..............] - ETA: 54s - loss: 1.0145 - regression_loss: 0.8884 - classification_loss: 0.1261 283/500 [===============>..............] - ETA: 54s - loss: 1.0144 - regression_loss: 0.8884 - classification_loss: 0.1260 284/500 [================>.............] - ETA: 53s - loss: 1.0162 - regression_loss: 0.8900 - classification_loss: 0.1262 285/500 [================>.............] - ETA: 53s - loss: 1.0189 - regression_loss: 0.8922 - classification_loss: 0.1267 286/500 [================>.............] 
- ETA: 53s - loss: 1.0190 - regression_loss: 0.8925 - classification_loss: 0.1265 287/500 [================>.............] - ETA: 53s - loss: 1.0198 - regression_loss: 0.8932 - classification_loss: 0.1265 288/500 [================>.............] - ETA: 52s - loss: 1.0190 - regression_loss: 0.8926 - classification_loss: 0.1264 289/500 [================>.............] - ETA: 52s - loss: 1.0179 - regression_loss: 0.8918 - classification_loss: 0.1261 290/500 [================>.............] - ETA: 52s - loss: 1.0173 - regression_loss: 0.8915 - classification_loss: 0.1259 291/500 [================>.............] - ETA: 52s - loss: 1.0147 - regression_loss: 0.8892 - classification_loss: 0.1256 292/500 [================>.............] - ETA: 51s - loss: 1.0159 - regression_loss: 0.8901 - classification_loss: 0.1258 293/500 [================>.............] - ETA: 51s - loss: 1.0162 - regression_loss: 0.8901 - classification_loss: 0.1260 294/500 [================>.............] - ETA: 51s - loss: 1.0171 - regression_loss: 0.8910 - classification_loss: 0.1261 295/500 [================>.............] - ETA: 51s - loss: 1.0171 - regression_loss: 0.8909 - classification_loss: 0.1262 296/500 [================>.............] - ETA: 50s - loss: 1.0172 - regression_loss: 0.8910 - classification_loss: 0.1262 297/500 [================>.............] - ETA: 50s - loss: 1.0157 - regression_loss: 0.8897 - classification_loss: 0.1259 298/500 [================>.............] - ETA: 50s - loss: 1.0171 - regression_loss: 0.8910 - classification_loss: 0.1261 299/500 [================>.............] - ETA: 50s - loss: 1.0167 - regression_loss: 0.8906 - classification_loss: 0.1261 300/500 [=================>............] - ETA: 49s - loss: 1.0166 - regression_loss: 0.8907 - classification_loss: 0.1260 301/500 [=================>............] - ETA: 49s - loss: 1.0161 - regression_loss: 0.8904 - classification_loss: 0.1257 302/500 [=================>............] 
- ETA: 49s - loss: 1.0172 - regression_loss: 0.8912 - classification_loss: 0.1260 303/500 [=================>............] - ETA: 49s - loss: 1.0174 - regression_loss: 0.8915 - classification_loss: 0.1259 304/500 [=================>............] - ETA: 48s - loss: 1.0177 - regression_loss: 0.8918 - classification_loss: 0.1259 305/500 [=================>............] - ETA: 48s - loss: 1.0171 - regression_loss: 0.8914 - classification_loss: 0.1257 306/500 [=================>............] - ETA: 48s - loss: 1.0165 - regression_loss: 0.8910 - classification_loss: 0.1255 307/500 [=================>............] - ETA: 48s - loss: 1.0148 - regression_loss: 0.8895 - classification_loss: 0.1253 308/500 [=================>............] - ETA: 47s - loss: 1.0145 - regression_loss: 0.8892 - classification_loss: 0.1252 309/500 [=================>............] - ETA: 47s - loss: 1.0147 - regression_loss: 0.8894 - classification_loss: 0.1253 310/500 [=================>............] - ETA: 47s - loss: 1.0148 - regression_loss: 0.8896 - classification_loss: 0.1252 311/500 [=================>............] - ETA: 47s - loss: 1.0148 - regression_loss: 0.8897 - classification_loss: 0.1251 312/500 [=================>............] - ETA: 46s - loss: 1.0156 - regression_loss: 0.8903 - classification_loss: 0.1252 313/500 [=================>............] - ETA: 46s - loss: 1.0148 - regression_loss: 0.8898 - classification_loss: 0.1250 314/500 [=================>............] - ETA: 46s - loss: 1.0147 - regression_loss: 0.8896 - classification_loss: 0.1251 315/500 [=================>............] - ETA: 46s - loss: 1.0147 - regression_loss: 0.8896 - classification_loss: 0.1250 316/500 [=================>............] - ETA: 45s - loss: 1.0148 - regression_loss: 0.8897 - classification_loss: 0.1251 317/500 [==================>...........] - ETA: 45s - loss: 1.0157 - regression_loss: 0.8906 - classification_loss: 0.1250 318/500 [==================>...........] 
- ETA: 45s - loss: 1.0137 - regression_loss: 0.8889 - classification_loss: 0.1247 319/500 [==================>...........] - ETA: 45s - loss: 1.0145 - regression_loss: 0.8898 - classification_loss: 0.1248 320/500 [==================>...........] - ETA: 44s - loss: 1.0131 - regression_loss: 0.8886 - classification_loss: 0.1245 321/500 [==================>...........] - ETA: 44s - loss: 1.0126 - regression_loss: 0.8883 - classification_loss: 0.1243 322/500 [==================>...........] - ETA: 44s - loss: 1.0125 - regression_loss: 0.8881 - classification_loss: 0.1244 323/500 [==================>...........] - ETA: 44s - loss: 1.0130 - regression_loss: 0.8886 - classification_loss: 0.1244 324/500 [==================>...........] - ETA: 43s - loss: 1.0130 - regression_loss: 0.8888 - classification_loss: 0.1242 325/500 [==================>...........] - ETA: 43s - loss: 1.0137 - regression_loss: 0.8894 - classification_loss: 0.1242 326/500 [==================>...........] - ETA: 43s - loss: 1.0144 - regression_loss: 0.8900 - classification_loss: 0.1243 327/500 [==================>...........] - ETA: 43s - loss: 1.0136 - regression_loss: 0.8893 - classification_loss: 0.1243 328/500 [==================>...........] - ETA: 42s - loss: 1.0131 - regression_loss: 0.8890 - classification_loss: 0.1241 329/500 [==================>...........] - ETA: 42s - loss: 1.0134 - regression_loss: 0.8894 - classification_loss: 0.1240 330/500 [==================>...........] - ETA: 42s - loss: 1.0110 - regression_loss: 0.8873 - classification_loss: 0.1237 331/500 [==================>...........] - ETA: 42s - loss: 1.0106 - regression_loss: 0.8869 - classification_loss: 0.1237 332/500 [==================>...........] - ETA: 41s - loss: 1.0105 - regression_loss: 0.8868 - classification_loss: 0.1237 333/500 [==================>...........] - ETA: 41s - loss: 1.0114 - regression_loss: 0.8876 - classification_loss: 0.1239 334/500 [===================>..........] 
- ETA: 41s - loss: 1.0119 - regression_loss: 0.8878 - classification_loss: 0.1241 335/500 [===================>..........] - ETA: 41s - loss: 1.0118 - regression_loss: 0.8878 - classification_loss: 0.1240 336/500 [===================>..........] - ETA: 40s - loss: 1.0107 - regression_loss: 0.8868 - classification_loss: 0.1239 337/500 [===================>..........] - ETA: 40s - loss: 1.0109 - regression_loss: 0.8870 - classification_loss: 0.1239 338/500 [===================>..........] - ETA: 40s - loss: 1.0108 - regression_loss: 0.8868 - classification_loss: 0.1240 339/500 [===================>..........] - ETA: 40s - loss: 1.0110 - regression_loss: 0.8871 - classification_loss: 0.1239 340/500 [===================>..........] - ETA: 39s - loss: 1.0126 - regression_loss: 0.8884 - classification_loss: 0.1242 341/500 [===================>..........] - ETA: 39s - loss: 1.0142 - regression_loss: 0.8900 - classification_loss: 0.1242 342/500 [===================>..........] - ETA: 39s - loss: 1.0146 - regression_loss: 0.8903 - classification_loss: 0.1243 343/500 [===================>..........] - ETA: 39s - loss: 1.0138 - regression_loss: 0.8897 - classification_loss: 0.1241 344/500 [===================>..........] - ETA: 38s - loss: 1.0131 - regression_loss: 0.8892 - classification_loss: 0.1240 345/500 [===================>..........] - ETA: 38s - loss: 1.0118 - regression_loss: 0.8882 - classification_loss: 0.1237 346/500 [===================>..........] - ETA: 38s - loss: 1.0111 - regression_loss: 0.8877 - classification_loss: 0.1234 347/500 [===================>..........] - ETA: 38s - loss: 1.0105 - regression_loss: 0.8873 - classification_loss: 0.1232 348/500 [===================>..........] - ETA: 37s - loss: 1.0091 - regression_loss: 0.8860 - classification_loss: 0.1230 349/500 [===================>..........] - ETA: 37s - loss: 1.0096 - regression_loss: 0.8865 - classification_loss: 0.1231 350/500 [====================>.........] 
- ETA: 37s - loss: 1.0093 - regression_loss: 0.8863 - classification_loss: 0.1230 351/500 [====================>.........] - ETA: 37s - loss: 1.0085 - regression_loss: 0.8857 - classification_loss: 0.1228 352/500 [====================>.........] - ETA: 36s - loss: 1.0075 - regression_loss: 0.8849 - classification_loss: 0.1226 353/500 [====================>.........] - ETA: 36s - loss: 1.0058 - regression_loss: 0.8835 - classification_loss: 0.1223 354/500 [====================>.........] - ETA: 36s - loss: 1.0047 - regression_loss: 0.8826 - classification_loss: 0.1221 355/500 [====================>.........] - ETA: 36s - loss: 1.0038 - regression_loss: 0.8818 - classification_loss: 0.1220 356/500 [====================>.........] - ETA: 35s - loss: 1.0034 - regression_loss: 0.8816 - classification_loss: 0.1218 357/500 [====================>.........] - ETA: 35s - loss: 1.0048 - regression_loss: 0.8830 - classification_loss: 0.1218 358/500 [====================>.........] - ETA: 35s - loss: 1.0069 - regression_loss: 0.8844 - classification_loss: 0.1225 359/500 [====================>.........] - ETA: 35s - loss: 1.0066 - regression_loss: 0.8842 - classification_loss: 0.1225 360/500 [====================>.........] - ETA: 34s - loss: 1.0067 - regression_loss: 0.8842 - classification_loss: 0.1225 361/500 [====================>.........] - ETA: 34s - loss: 1.0064 - regression_loss: 0.8841 - classification_loss: 0.1223 362/500 [====================>.........] - ETA: 34s - loss: 1.0071 - regression_loss: 0.8849 - classification_loss: 0.1222 363/500 [====================>.........] - ETA: 34s - loss: 1.0072 - regression_loss: 0.8850 - classification_loss: 0.1221 364/500 [====================>.........] - ETA: 33s - loss: 1.0062 - regression_loss: 0.8842 - classification_loss: 0.1220 365/500 [====================>.........] - ETA: 33s - loss: 1.0072 - regression_loss: 0.8850 - classification_loss: 0.1221 366/500 [====================>.........] 
- ETA: 33s - loss: 1.0076 - regression_loss: 0.8855 - classification_loss: 0.1221 367/500 [=====================>........] - ETA: 33s - loss: 1.0063 - regression_loss: 0.8844 - classification_loss: 0.1220 368/500 [=====================>........] - ETA: 32s - loss: 1.0067 - regression_loss: 0.8847 - classification_loss: 0.1220 369/500 [=====================>........] - ETA: 32s - loss: 1.0079 - regression_loss: 0.8856 - classification_loss: 0.1222 370/500 [=====================>........] - ETA: 32s - loss: 1.0082 - regression_loss: 0.8859 - classification_loss: 0.1223 371/500 [=====================>........] - ETA: 32s - loss: 1.0089 - regression_loss: 0.8867 - classification_loss: 0.1222 372/500 [=====================>........] - ETA: 31s - loss: 1.0102 - regression_loss: 0.8879 - classification_loss: 0.1224 373/500 [=====================>........] - ETA: 31s - loss: 1.0102 - regression_loss: 0.8879 - classification_loss: 0.1223 374/500 [=====================>........] - ETA: 31s - loss: 1.0100 - regression_loss: 0.8877 - classification_loss: 0.1223 375/500 [=====================>........] - ETA: 31s - loss: 1.0111 - regression_loss: 0.8886 - classification_loss: 0.1225 376/500 [=====================>........] - ETA: 30s - loss: 1.0117 - regression_loss: 0.8889 - classification_loss: 0.1227 377/500 [=====================>........] - ETA: 30s - loss: 1.0115 - regression_loss: 0.8889 - classification_loss: 0.1226 378/500 [=====================>........] - ETA: 30s - loss: 1.0115 - regression_loss: 0.8888 - classification_loss: 0.1226 379/500 [=====================>........] - ETA: 30s - loss: 1.0116 - regression_loss: 0.8891 - classification_loss: 0.1225 380/500 [=====================>........] - ETA: 29s - loss: 1.0101 - regression_loss: 0.8879 - classification_loss: 0.1222 381/500 [=====================>........] - ETA: 29s - loss: 1.0102 - regression_loss: 0.8880 - classification_loss: 0.1222 382/500 [=====================>........] 
- ETA: 29s - loss: 1.0082 - regression_loss: 0.8862 - classification_loss: 0.1219 383/500 [=====================>........] - ETA: 29s - loss: 1.0099 - regression_loss: 0.8878 - classification_loss: 0.1221 384/500 [======================>.......] - ETA: 28s - loss: 1.0097 - regression_loss: 0.8876 - classification_loss: 0.1221 385/500 [======================>.......] - ETA: 28s - loss: 1.0094 - regression_loss: 0.8874 - classification_loss: 0.1220 386/500 [======================>.......] - ETA: 28s - loss: 1.0102 - regression_loss: 0.8881 - classification_loss: 0.1221 387/500 [======================>.......] - ETA: 28s - loss: 1.0108 - regression_loss: 0.8885 - classification_loss: 0.1223 388/500 [======================>.......] - ETA: 27s - loss: 1.0100 - regression_loss: 0.8879 - classification_loss: 0.1221 389/500 [======================>.......] - ETA: 27s - loss: 1.0098 - regression_loss: 0.8878 - classification_loss: 0.1221 390/500 [======================>.......] - ETA: 27s - loss: 1.0090 - regression_loss: 0.8871 - classification_loss: 0.1219 391/500 [======================>.......] - ETA: 27s - loss: 1.0096 - regression_loss: 0.8876 - classification_loss: 0.1220 392/500 [======================>.......] - ETA: 26s - loss: 1.0105 - regression_loss: 0.8884 - classification_loss: 0.1221 393/500 [======================>.......] - ETA: 26s - loss: 1.0094 - regression_loss: 0.8876 - classification_loss: 0.1218 394/500 [======================>.......] - ETA: 26s - loss: 1.0097 - regression_loss: 0.8879 - classification_loss: 0.1218 395/500 [======================>.......] - ETA: 26s - loss: 1.0096 - regression_loss: 0.8877 - classification_loss: 0.1219 396/500 [======================>.......] - ETA: 25s - loss: 1.0089 - regression_loss: 0.8872 - classification_loss: 0.1217 397/500 [======================>.......] - ETA: 25s - loss: 1.0092 - regression_loss: 0.8875 - classification_loss: 0.1217 398/500 [======================>.......] 
- ETA: 25s - loss: 1.0107 - regression_loss: 0.8888 - classification_loss: 0.1219 399/500 [======================>.......] - ETA: 25s - loss: 1.0091 - regression_loss: 0.8875 - classification_loss: 0.1216 400/500 [=======================>......] - ETA: 24s - loss: 1.0102 - regression_loss: 0.8884 - classification_loss: 0.1218 401/500 [=======================>......] - ETA: 24s - loss: 1.0109 - regression_loss: 0.8891 - classification_loss: 0.1218 402/500 [=======================>......] - ETA: 24s - loss: 1.0118 - regression_loss: 0.8898 - classification_loss: 0.1220 403/500 [=======================>......] - ETA: 24s - loss: 1.0107 - regression_loss: 0.8889 - classification_loss: 0.1218 404/500 [=======================>......] - ETA: 23s - loss: 1.0108 - regression_loss: 0.8891 - classification_loss: 0.1217 405/500 [=======================>......] - ETA: 23s - loss: 1.0125 - regression_loss: 0.8906 - classification_loss: 0.1219 406/500 [=======================>......] - ETA: 23s - loss: 1.0117 - regression_loss: 0.8901 - classification_loss: 0.1217 407/500 [=======================>......] - ETA: 23s - loss: 1.0136 - regression_loss: 0.8916 - classification_loss: 0.1220 408/500 [=======================>......] - ETA: 22s - loss: 1.0141 - regression_loss: 0.8921 - classification_loss: 0.1220 409/500 [=======================>......] - ETA: 22s - loss: 1.0137 - regression_loss: 0.8919 - classification_loss: 0.1218 410/500 [=======================>......] - ETA: 22s - loss: 1.0143 - regression_loss: 0.8925 - classification_loss: 0.1218 411/500 [=======================>......] - ETA: 22s - loss: 1.0150 - regression_loss: 0.8931 - classification_loss: 0.1219 412/500 [=======================>......] - ETA: 21s - loss: 1.0154 - regression_loss: 0.8934 - classification_loss: 0.1220 413/500 [=======================>......] - ETA: 21s - loss: 1.0157 - regression_loss: 0.8936 - classification_loss: 0.1220 414/500 [=======================>......] 
- ETA: 21s - loss: 1.0166 - regression_loss: 0.8944 - classification_loss: 0.1222 415/500 [=======================>......] - ETA: 21s - loss: 1.0170 - regression_loss: 0.8948 - classification_loss: 0.1222 416/500 [=======================>......] - ETA: 20s - loss: 1.0165 - regression_loss: 0.8945 - classification_loss: 0.1221 417/500 [========================>.....] - ETA: 20s - loss: 1.0167 - regression_loss: 0.8947 - classification_loss: 0.1220 418/500 [========================>.....] - ETA: 20s - loss: 1.0159 - regression_loss: 0.8941 - classification_loss: 0.1218 419/500 [========================>.....] - ETA: 20s - loss: 1.0144 - regression_loss: 0.8928 - classification_loss: 0.1216 420/500 [========================>.....] - ETA: 19s - loss: 1.0141 - regression_loss: 0.8926 - classification_loss: 0.1214 421/500 [========================>.....] - ETA: 19s - loss: 1.0145 - regression_loss: 0.8931 - classification_loss: 0.1214 422/500 [========================>.....] - ETA: 19s - loss: 1.0146 - regression_loss: 0.8932 - classification_loss: 0.1214 423/500 [========================>.....] - ETA: 19s - loss: 1.0150 - regression_loss: 0.8936 - classification_loss: 0.1213 424/500 [========================>.....] - ETA: 18s - loss: 1.0143 - regression_loss: 0.8931 - classification_loss: 0.1212 425/500 [========================>.....] - ETA: 18s - loss: 1.0133 - regression_loss: 0.8923 - classification_loss: 0.1210 426/500 [========================>.....] - ETA: 18s - loss: 1.0131 - regression_loss: 0.8922 - classification_loss: 0.1210 427/500 [========================>.....] - ETA: 18s - loss: 1.0139 - regression_loss: 0.8928 - classification_loss: 0.1211 428/500 [========================>.....] - ETA: 17s - loss: 1.0134 - regression_loss: 0.8924 - classification_loss: 0.1210 429/500 [========================>.....] - ETA: 17s - loss: 1.0138 - regression_loss: 0.8927 - classification_loss: 0.1211 430/500 [========================>.....] 
500/500 [==============================] - 125s 250ms/step - loss: 1.0105 - regression_loss: 0.8908 - classification_loss: 0.1196
1172 instances of class plum with average precision: 0.7774
mAP: 0.7774
Epoch 00038: saving model to ./training/snapshots/resnet50_pascal_38.h5
Epoch 39/150
- ETA: 58s - loss: 0.9949 - regression_loss: 0.8888 - classification_loss: 0.1061 266/500 [==============>...............] - ETA: 58s - loss: 0.9935 - regression_loss: 0.8874 - classification_loss: 0.1061 267/500 [===============>..............] - ETA: 57s - loss: 0.9959 - regression_loss: 0.8892 - classification_loss: 0.1066 268/500 [===============>..............] - ETA: 57s - loss: 0.9968 - regression_loss: 0.8899 - classification_loss: 0.1068 269/500 [===============>..............] - ETA: 57s - loss: 0.9965 - regression_loss: 0.8898 - classification_loss: 0.1067 270/500 [===============>..............] - ETA: 57s - loss: 0.9957 - regression_loss: 0.8892 - classification_loss: 0.1065 271/500 [===============>..............] - ETA: 56s - loss: 0.9968 - regression_loss: 0.8901 - classification_loss: 0.1067 272/500 [===============>..............] - ETA: 56s - loss: 0.9971 - regression_loss: 0.8903 - classification_loss: 0.1068 273/500 [===============>..............] - ETA: 56s - loss: 0.9984 - regression_loss: 0.8915 - classification_loss: 0.1069 274/500 [===============>..............] - ETA: 56s - loss: 0.9968 - regression_loss: 0.8900 - classification_loss: 0.1068 275/500 [===============>..............] - ETA: 55s - loss: 0.9957 - regression_loss: 0.8892 - classification_loss: 0.1065 276/500 [===============>..............] - ETA: 55s - loss: 0.9954 - regression_loss: 0.8891 - classification_loss: 0.1064 277/500 [===============>..............] - ETA: 55s - loss: 0.9951 - regression_loss: 0.8888 - classification_loss: 0.1063 278/500 [===============>..............] - ETA: 55s - loss: 0.9941 - regression_loss: 0.8879 - classification_loss: 0.1062 279/500 [===============>..............] - ETA: 54s - loss: 0.9959 - regression_loss: 0.8895 - classification_loss: 0.1064 280/500 [===============>..............] - ETA: 54s - loss: 0.9955 - regression_loss: 0.8891 - classification_loss: 0.1064 281/500 [===============>..............] 
- ETA: 54s - loss: 0.9957 - regression_loss: 0.8892 - classification_loss: 0.1065 282/500 [===============>..............] - ETA: 54s - loss: 0.9958 - regression_loss: 0.8893 - classification_loss: 0.1065 283/500 [===============>..............] - ETA: 53s - loss: 0.9954 - regression_loss: 0.8888 - classification_loss: 0.1065 284/500 [================>.............] - ETA: 53s - loss: 0.9945 - regression_loss: 0.8880 - classification_loss: 0.1065 285/500 [================>.............] - ETA: 53s - loss: 0.9942 - regression_loss: 0.8878 - classification_loss: 0.1065 286/500 [================>.............] - ETA: 53s - loss: 0.9943 - regression_loss: 0.8878 - classification_loss: 0.1064 287/500 [================>.............] - ETA: 52s - loss: 0.9944 - regression_loss: 0.8880 - classification_loss: 0.1064 288/500 [================>.............] - ETA: 52s - loss: 0.9930 - regression_loss: 0.8868 - classification_loss: 0.1062 289/500 [================>.............] - ETA: 52s - loss: 0.9936 - regression_loss: 0.8875 - classification_loss: 0.1061 290/500 [================>.............] - ETA: 52s - loss: 0.9915 - regression_loss: 0.8857 - classification_loss: 0.1058 291/500 [================>.............] - ETA: 51s - loss: 0.9925 - regression_loss: 0.8865 - classification_loss: 0.1060 292/500 [================>.............] - ETA: 51s - loss: 0.9943 - regression_loss: 0.8879 - classification_loss: 0.1063 293/500 [================>.............] - ETA: 51s - loss: 0.9958 - regression_loss: 0.8894 - classification_loss: 0.1064 294/500 [================>.............] - ETA: 51s - loss: 0.9987 - regression_loss: 0.8917 - classification_loss: 0.1070 295/500 [================>.............] - ETA: 50s - loss: 0.9980 - regression_loss: 0.8911 - classification_loss: 0.1069 296/500 [================>.............] - ETA: 50s - loss: 0.9979 - regression_loss: 0.8910 - classification_loss: 0.1068 297/500 [================>.............] 
- ETA: 50s - loss: 0.9984 - regression_loss: 0.8915 - classification_loss: 0.1070 298/500 [================>.............] - ETA: 50s - loss: 0.9975 - regression_loss: 0.8907 - classification_loss: 0.1068 299/500 [================>.............] - ETA: 49s - loss: 0.9953 - regression_loss: 0.8887 - classification_loss: 0.1066 300/500 [=================>............] - ETA: 49s - loss: 0.9962 - regression_loss: 0.8895 - classification_loss: 0.1067 301/500 [=================>............] - ETA: 49s - loss: 0.9972 - regression_loss: 0.8904 - classification_loss: 0.1068 302/500 [=================>............] - ETA: 49s - loss: 0.9970 - regression_loss: 0.8902 - classification_loss: 0.1068 303/500 [=================>............] - ETA: 48s - loss: 0.9958 - regression_loss: 0.8893 - classification_loss: 0.1066 304/500 [=================>............] - ETA: 48s - loss: 0.9943 - regression_loss: 0.8878 - classification_loss: 0.1065 305/500 [=================>............] - ETA: 48s - loss: 0.9950 - regression_loss: 0.8884 - classification_loss: 0.1066 306/500 [=================>............] - ETA: 48s - loss: 0.9944 - regression_loss: 0.8879 - classification_loss: 0.1065 307/500 [=================>............] - ETA: 47s - loss: 0.9944 - regression_loss: 0.8881 - classification_loss: 0.1064 308/500 [=================>............] - ETA: 47s - loss: 0.9939 - regression_loss: 0.8875 - classification_loss: 0.1064 309/500 [=================>............] - ETA: 47s - loss: 0.9942 - regression_loss: 0.8877 - classification_loss: 0.1065 310/500 [=================>............] - ETA: 47s - loss: 0.9931 - regression_loss: 0.8868 - classification_loss: 0.1063 311/500 [=================>............] - ETA: 46s - loss: 0.9938 - regression_loss: 0.8873 - classification_loss: 0.1065 312/500 [=================>............] - ETA: 46s - loss: 0.9934 - regression_loss: 0.8870 - classification_loss: 0.1064 313/500 [=================>............] 
- ETA: 46s - loss: 0.9931 - regression_loss: 0.8868 - classification_loss: 0.1063 314/500 [=================>............] - ETA: 46s - loss: 0.9933 - regression_loss: 0.8870 - classification_loss: 0.1063 315/500 [=================>............] - ETA: 46s - loss: 0.9918 - regression_loss: 0.8857 - classification_loss: 0.1061 316/500 [=================>............] - ETA: 45s - loss: 0.9938 - regression_loss: 0.8871 - classification_loss: 0.1066 317/500 [==================>...........] - ETA: 45s - loss: 0.9940 - regression_loss: 0.8873 - classification_loss: 0.1067 318/500 [==================>...........] - ETA: 45s - loss: 0.9924 - regression_loss: 0.8860 - classification_loss: 0.1065 319/500 [==================>...........] - ETA: 45s - loss: 0.9929 - regression_loss: 0.8863 - classification_loss: 0.1066 320/500 [==================>...........] - ETA: 44s - loss: 0.9923 - regression_loss: 0.8860 - classification_loss: 0.1064 321/500 [==================>...........] - ETA: 44s - loss: 0.9935 - regression_loss: 0.8869 - classification_loss: 0.1066 322/500 [==================>...........] - ETA: 44s - loss: 0.9933 - regression_loss: 0.8867 - classification_loss: 0.1066 323/500 [==================>...........] - ETA: 44s - loss: 0.9939 - regression_loss: 0.8872 - classification_loss: 0.1067 324/500 [==================>...........] - ETA: 43s - loss: 0.9942 - regression_loss: 0.8874 - classification_loss: 0.1068 325/500 [==================>...........] - ETA: 43s - loss: 0.9928 - regression_loss: 0.8862 - classification_loss: 0.1065 326/500 [==================>...........] - ETA: 43s - loss: 0.9935 - regression_loss: 0.8869 - classification_loss: 0.1066 327/500 [==================>...........] - ETA: 43s - loss: 0.9939 - regression_loss: 0.8873 - classification_loss: 0.1066 328/500 [==================>...........] - ETA: 42s - loss: 0.9923 - regression_loss: 0.8859 - classification_loss: 0.1064 329/500 [==================>...........] 
- ETA: 42s - loss: 0.9928 - regression_loss: 0.8862 - classification_loss: 0.1066 330/500 [==================>...........] - ETA: 42s - loss: 0.9925 - regression_loss: 0.8860 - classification_loss: 0.1066 331/500 [==================>...........] - ETA: 42s - loss: 0.9923 - regression_loss: 0.8857 - classification_loss: 0.1066 332/500 [==================>...........] - ETA: 41s - loss: 0.9917 - regression_loss: 0.8852 - classification_loss: 0.1065 333/500 [==================>...........] - ETA: 41s - loss: 0.9919 - regression_loss: 0.8853 - classification_loss: 0.1066 334/500 [===================>..........] - ETA: 41s - loss: 0.9924 - regression_loss: 0.8859 - classification_loss: 0.1065 335/500 [===================>..........] - ETA: 41s - loss: 0.9913 - regression_loss: 0.8850 - classification_loss: 0.1063 336/500 [===================>..........] - ETA: 40s - loss: 0.9928 - regression_loss: 0.8863 - classification_loss: 0.1065 337/500 [===================>..........] - ETA: 40s - loss: 0.9926 - regression_loss: 0.8860 - classification_loss: 0.1065 338/500 [===================>..........] - ETA: 40s - loss: 0.9928 - regression_loss: 0.8862 - classification_loss: 0.1065 339/500 [===================>..........] - ETA: 40s - loss: 0.9931 - regression_loss: 0.8865 - classification_loss: 0.1066 340/500 [===================>..........] - ETA: 39s - loss: 0.9932 - regression_loss: 0.8866 - classification_loss: 0.1066 341/500 [===================>..........] - ETA: 39s - loss: 0.9944 - regression_loss: 0.8876 - classification_loss: 0.1067 342/500 [===================>..........] - ETA: 39s - loss: 0.9954 - regression_loss: 0.8885 - classification_loss: 0.1069 343/500 [===================>..........] - ETA: 39s - loss: 0.9934 - regression_loss: 0.8867 - classification_loss: 0.1067 344/500 [===================>..........] - ETA: 38s - loss: 0.9919 - regression_loss: 0.8854 - classification_loss: 0.1065 345/500 [===================>..........] 
- ETA: 38s - loss: 0.9923 - regression_loss: 0.8857 - classification_loss: 0.1065 346/500 [===================>..........] - ETA: 38s - loss: 0.9939 - regression_loss: 0.8871 - classification_loss: 0.1068 347/500 [===================>..........] - ETA: 38s - loss: 0.9927 - regression_loss: 0.8860 - classification_loss: 0.1067 348/500 [===================>..........] - ETA: 37s - loss: 0.9918 - regression_loss: 0.8853 - classification_loss: 0.1065 349/500 [===================>..........] - ETA: 37s - loss: 0.9924 - regression_loss: 0.8859 - classification_loss: 0.1066 350/500 [====================>.........] - ETA: 37s - loss: 0.9922 - regression_loss: 0.8857 - classification_loss: 0.1065 351/500 [====================>.........] - ETA: 37s - loss: 0.9904 - regression_loss: 0.8840 - classification_loss: 0.1064 352/500 [====================>.........] - ETA: 36s - loss: 0.9913 - regression_loss: 0.8846 - classification_loss: 0.1066 353/500 [====================>.........] - ETA: 36s - loss: 0.9917 - regression_loss: 0.8850 - classification_loss: 0.1067 354/500 [====================>.........] - ETA: 36s - loss: 0.9913 - regression_loss: 0.8846 - classification_loss: 0.1067 355/500 [====================>.........] - ETA: 36s - loss: 0.9898 - regression_loss: 0.8833 - classification_loss: 0.1065 356/500 [====================>.........] - ETA: 35s - loss: 0.9914 - regression_loss: 0.8846 - classification_loss: 0.1068 357/500 [====================>.........] - ETA: 35s - loss: 0.9905 - regression_loss: 0.8838 - classification_loss: 0.1067 358/500 [====================>.........] - ETA: 35s - loss: 0.9903 - regression_loss: 0.8836 - classification_loss: 0.1067 359/500 [====================>.........] - ETA: 35s - loss: 0.9907 - regression_loss: 0.8841 - classification_loss: 0.1067 360/500 [====================>.........] - ETA: 34s - loss: 0.9890 - regression_loss: 0.8826 - classification_loss: 0.1064 361/500 [====================>.........] 
- ETA: 34s - loss: 0.9875 - regression_loss: 0.8813 - classification_loss: 0.1061 362/500 [====================>.........] - ETA: 34s - loss: 0.9889 - regression_loss: 0.8825 - classification_loss: 0.1064 363/500 [====================>.........] - ETA: 34s - loss: 0.9913 - regression_loss: 0.8844 - classification_loss: 0.1069 364/500 [====================>.........] - ETA: 33s - loss: 0.9922 - regression_loss: 0.8851 - classification_loss: 0.1070 365/500 [====================>.........] - ETA: 33s - loss: 0.9926 - regression_loss: 0.8855 - classification_loss: 0.1071 366/500 [====================>.........] - ETA: 33s - loss: 0.9937 - regression_loss: 0.8863 - classification_loss: 0.1074 367/500 [=====================>........] - ETA: 33s - loss: 0.9938 - regression_loss: 0.8864 - classification_loss: 0.1073 368/500 [=====================>........] - ETA: 32s - loss: 0.9950 - regression_loss: 0.8874 - classification_loss: 0.1076 369/500 [=====================>........] - ETA: 32s - loss: 0.9950 - regression_loss: 0.8874 - classification_loss: 0.1076 370/500 [=====================>........] - ETA: 32s - loss: 0.9946 - regression_loss: 0.8871 - classification_loss: 0.1075 371/500 [=====================>........] - ETA: 32s - loss: 0.9960 - regression_loss: 0.8882 - classification_loss: 0.1078 372/500 [=====================>........] - ETA: 31s - loss: 0.9952 - regression_loss: 0.8873 - classification_loss: 0.1079 373/500 [=====================>........] - ETA: 31s - loss: 0.9943 - regression_loss: 0.8866 - classification_loss: 0.1077 374/500 [=====================>........] - ETA: 31s - loss: 0.9950 - regression_loss: 0.8873 - classification_loss: 0.1077 375/500 [=====================>........] - ETA: 31s - loss: 1.0001 - regression_loss: 0.8898 - classification_loss: 0.1102 376/500 [=====================>........] - ETA: 30s - loss: 0.9996 - regression_loss: 0.8895 - classification_loss: 0.1101 377/500 [=====================>........] 
- ETA: 30s - loss: 1.0027 - regression_loss: 0.8921 - classification_loss: 0.1107 378/500 [=====================>........] - ETA: 30s - loss: 1.0017 - regression_loss: 0.8912 - classification_loss: 0.1105 379/500 [=====================>........] - ETA: 30s - loss: 1.0009 - regression_loss: 0.8905 - classification_loss: 0.1104 380/500 [=====================>........] - ETA: 29s - loss: 1.0013 - regression_loss: 0.8909 - classification_loss: 0.1104 381/500 [=====================>........] - ETA: 29s - loss: 1.0020 - regression_loss: 0.8915 - classification_loss: 0.1105 382/500 [=====================>........] - ETA: 29s - loss: 1.0040 - regression_loss: 0.8932 - classification_loss: 0.1108 383/500 [=====================>........] - ETA: 29s - loss: 1.0031 - regression_loss: 0.8924 - classification_loss: 0.1108 384/500 [======================>.......] - ETA: 28s - loss: 1.0023 - regression_loss: 0.8915 - classification_loss: 0.1108 385/500 [======================>.......] - ETA: 28s - loss: 1.0019 - regression_loss: 0.8911 - classification_loss: 0.1108 386/500 [======================>.......] - ETA: 28s - loss: 1.0014 - regression_loss: 0.8908 - classification_loss: 0.1106 387/500 [======================>.......] - ETA: 28s - loss: 1.0001 - regression_loss: 0.8898 - classification_loss: 0.1104 388/500 [======================>.......] - ETA: 27s - loss: 0.9994 - regression_loss: 0.8889 - classification_loss: 0.1105 389/500 [======================>.......] - ETA: 27s - loss: 0.9994 - regression_loss: 0.8890 - classification_loss: 0.1104 390/500 [======================>.......] - ETA: 27s - loss: 0.9983 - regression_loss: 0.8881 - classification_loss: 0.1102 391/500 [======================>.......] - ETA: 27s - loss: 0.9986 - regression_loss: 0.8883 - classification_loss: 0.1102 392/500 [======================>.......] - ETA: 26s - loss: 0.9986 - regression_loss: 0.8883 - classification_loss: 0.1103 393/500 [======================>.......] 
- ETA: 26s - loss: 0.9984 - regression_loss: 0.8881 - classification_loss: 0.1102 394/500 [======================>.......] - ETA: 26s - loss: 0.9987 - regression_loss: 0.8884 - classification_loss: 0.1103 395/500 [======================>.......] - ETA: 26s - loss: 1.0004 - regression_loss: 0.8899 - classification_loss: 0.1105 396/500 [======================>.......] - ETA: 25s - loss: 1.0010 - regression_loss: 0.8904 - classification_loss: 0.1106 397/500 [======================>.......] - ETA: 25s - loss: 0.9999 - regression_loss: 0.8895 - classification_loss: 0.1104 398/500 [======================>.......] - ETA: 25s - loss: 1.0016 - regression_loss: 0.8910 - classification_loss: 0.1105 399/500 [======================>.......] - ETA: 25s - loss: 1.0026 - regression_loss: 0.8920 - classification_loss: 0.1106 400/500 [=======================>......] - ETA: 24s - loss: 1.0032 - regression_loss: 0.8925 - classification_loss: 0.1108 401/500 [=======================>......] - ETA: 24s - loss: 1.0031 - regression_loss: 0.8923 - classification_loss: 0.1108 402/500 [=======================>......] - ETA: 24s - loss: 1.0025 - regression_loss: 0.8918 - classification_loss: 0.1107 403/500 [=======================>......] - ETA: 24s - loss: 1.0034 - regression_loss: 0.8926 - classification_loss: 0.1107 404/500 [=======================>......] - ETA: 23s - loss: 1.0040 - regression_loss: 0.8932 - classification_loss: 0.1109 405/500 [=======================>......] - ETA: 23s - loss: 1.0038 - regression_loss: 0.8931 - classification_loss: 0.1107 406/500 [=======================>......] - ETA: 23s - loss: 1.0041 - regression_loss: 0.8934 - classification_loss: 0.1107 407/500 [=======================>......] - ETA: 23s - loss: 1.0037 - regression_loss: 0.8931 - classification_loss: 0.1105 408/500 [=======================>......] - ETA: 22s - loss: 1.0019 - regression_loss: 0.8915 - classification_loss: 0.1104 409/500 [=======================>......] 
- ETA: 22s - loss: 1.0015 - regression_loss: 0.8911 - classification_loss: 0.1104 410/500 [=======================>......] - ETA: 22s - loss: 1.0010 - regression_loss: 0.8908 - classification_loss: 0.1102 411/500 [=======================>......] - ETA: 22s - loss: 1.0007 - regression_loss: 0.8906 - classification_loss: 0.1101 412/500 [=======================>......] - ETA: 21s - loss: 1.0020 - regression_loss: 0.8917 - classification_loss: 0.1103 413/500 [=======================>......] - ETA: 21s - loss: 1.0026 - regression_loss: 0.8922 - classification_loss: 0.1104 414/500 [=======================>......] - ETA: 21s - loss: 1.0034 - regression_loss: 0.8929 - classification_loss: 0.1105 415/500 [=======================>......] - ETA: 21s - loss: 1.0038 - regression_loss: 0.8933 - classification_loss: 0.1105 416/500 [=======================>......] - ETA: 20s - loss: 1.0029 - regression_loss: 0.8925 - classification_loss: 0.1104 417/500 [========================>.....] - ETA: 20s - loss: 1.0036 - regression_loss: 0.8930 - classification_loss: 0.1105 418/500 [========================>.....] - ETA: 20s - loss: 1.0029 - regression_loss: 0.8925 - classification_loss: 0.1104 419/500 [========================>.....] - ETA: 20s - loss: 1.0030 - regression_loss: 0.8926 - classification_loss: 0.1105 420/500 [========================>.....] - ETA: 19s - loss: 1.0041 - regression_loss: 0.8932 - classification_loss: 0.1109 421/500 [========================>.....] - ETA: 19s - loss: 1.0026 - regression_loss: 0.8919 - classification_loss: 0.1107 422/500 [========================>.....] - ETA: 19s - loss: 1.0026 - regression_loss: 0.8919 - classification_loss: 0.1107 423/500 [========================>.....] - ETA: 19s - loss: 1.0029 - regression_loss: 0.8922 - classification_loss: 0.1107 424/500 [========================>.....] - ETA: 18s - loss: 1.0033 - regression_loss: 0.8925 - classification_loss: 0.1108 425/500 [========================>.....] 
- ETA: 18s - loss: 1.0025 - regression_loss: 0.8919 - classification_loss: 0.1106 426/500 [========================>.....] - ETA: 18s - loss: 1.0018 - regression_loss: 0.8913 - classification_loss: 0.1105 427/500 [========================>.....] - ETA: 18s - loss: 1.0015 - regression_loss: 0.8910 - classification_loss: 0.1104 428/500 [========================>.....] - ETA: 17s - loss: 1.0028 - regression_loss: 0.8923 - classification_loss: 0.1105 429/500 [========================>.....] - ETA: 17s - loss: 1.0027 - regression_loss: 0.8922 - classification_loss: 0.1105 430/500 [========================>.....] - ETA: 17s - loss: 1.0027 - regression_loss: 0.8922 - classification_loss: 0.1106 431/500 [========================>.....] - ETA: 17s - loss: 1.0034 - regression_loss: 0.8927 - classification_loss: 0.1107 432/500 [========================>.....] - ETA: 16s - loss: 1.0034 - regression_loss: 0.8928 - classification_loss: 0.1106 433/500 [========================>.....] - ETA: 16s - loss: 1.0041 - regression_loss: 0.8933 - classification_loss: 0.1108 434/500 [=========================>....] - ETA: 16s - loss: 1.0032 - regression_loss: 0.8925 - classification_loss: 0.1106 435/500 [=========================>....] - ETA: 16s - loss: 1.0040 - regression_loss: 0.8931 - classification_loss: 0.1108 436/500 [=========================>....] - ETA: 15s - loss: 1.0027 - regression_loss: 0.8920 - classification_loss: 0.1106 437/500 [=========================>....] - ETA: 15s - loss: 1.0018 - regression_loss: 0.8913 - classification_loss: 0.1105 438/500 [=========================>....] - ETA: 15s - loss: 1.0013 - regression_loss: 0.8909 - classification_loss: 0.1104 439/500 [=========================>....] - ETA: 15s - loss: 1.0008 - regression_loss: 0.8903 - classification_loss: 0.1105 440/500 [=========================>....] - ETA: 14s - loss: 1.0011 - regression_loss: 0.8906 - classification_loss: 0.1105 441/500 [=========================>....] 
- ETA: 14s - loss: 0.9997 - regression_loss: 0.8894 - classification_loss: 0.1103 442/500 [=========================>....] - ETA: 14s - loss: 1.0001 - regression_loss: 0.8899 - classification_loss: 0.1103 443/500 [=========================>....] - ETA: 14s - loss: 1.0006 - regression_loss: 0.8903 - classification_loss: 0.1103 444/500 [=========================>....] - ETA: 13s - loss: 1.0005 - regression_loss: 0.8903 - classification_loss: 0.1102 445/500 [=========================>....] - ETA: 13s - loss: 0.9999 - regression_loss: 0.8898 - classification_loss: 0.1101 446/500 [=========================>....] - ETA: 13s - loss: 0.9993 - regression_loss: 0.8894 - classification_loss: 0.1099 447/500 [=========================>....] - ETA: 13s - loss: 0.9990 - regression_loss: 0.8891 - classification_loss: 0.1098 448/500 [=========================>....] - ETA: 12s - loss: 0.9989 - regression_loss: 0.8890 - classification_loss: 0.1098 449/500 [=========================>....] - ETA: 12s - loss: 0.9990 - regression_loss: 0.8892 - classification_loss: 0.1098 450/500 [==========================>...] - ETA: 12s - loss: 0.9978 - regression_loss: 0.8882 - classification_loss: 0.1096 451/500 [==========================>...] - ETA: 12s - loss: 0.9966 - regression_loss: 0.8871 - classification_loss: 0.1095 452/500 [==========================>...] - ETA: 11s - loss: 0.9959 - regression_loss: 0.8865 - classification_loss: 0.1094 453/500 [==========================>...] - ETA: 11s - loss: 0.9967 - regression_loss: 0.8871 - classification_loss: 0.1097 454/500 [==========================>...] - ETA: 11s - loss: 0.9956 - regression_loss: 0.8861 - classification_loss: 0.1095 455/500 [==========================>...] - ETA: 11s - loss: 0.9954 - regression_loss: 0.8861 - classification_loss: 0.1093 456/500 [==========================>...] - ETA: 10s - loss: 0.9957 - regression_loss: 0.8864 - classification_loss: 0.1093 457/500 [==========================>...] 
- ETA: 10s - loss: 0.9951 - regression_loss: 0.8858 - classification_loss: 0.1093 458/500 [==========================>...] - ETA: 10s - loss: 0.9950 - regression_loss: 0.8858 - classification_loss: 0.1092 459/500 [==========================>...] - ETA: 10s - loss: 0.9949 - regression_loss: 0.8857 - classification_loss: 0.1092 460/500 [==========================>...] - ETA: 9s - loss: 0.9950 - regression_loss: 0.8859 - classification_loss: 0.1091  461/500 [==========================>...] - ETA: 9s - loss: 0.9956 - regression_loss: 0.8864 - classification_loss: 0.1092 462/500 [==========================>...] - ETA: 9s - loss: 0.9960 - regression_loss: 0.8867 - classification_loss: 0.1093 463/500 [==========================>...] - ETA: 9s - loss: 0.9964 - regression_loss: 0.8872 - classification_loss: 0.1092 464/500 [==========================>...] - ETA: 8s - loss: 0.9955 - regression_loss: 0.8864 - classification_loss: 0.1091 465/500 [==========================>...] - ETA: 8s - loss: 0.9953 - regression_loss: 0.8863 - classification_loss: 0.1090 466/500 [==========================>...] - ETA: 8s - loss: 0.9966 - regression_loss: 0.8873 - classification_loss: 0.1093 467/500 [===========================>..] - ETA: 8s - loss: 0.9969 - regression_loss: 0.8878 - classification_loss: 0.1092 468/500 [===========================>..] - ETA: 7s - loss: 0.9958 - regression_loss: 0.8868 - classification_loss: 0.1090 469/500 [===========================>..] - ETA: 7s - loss: 0.9957 - regression_loss: 0.8867 - classification_loss: 0.1089 470/500 [===========================>..] - ETA: 7s - loss: 0.9958 - regression_loss: 0.8868 - classification_loss: 0.1090 471/500 [===========================>..] - ETA: 7s - loss: 0.9953 - regression_loss: 0.8864 - classification_loss: 0.1090 472/500 [===========================>..] - ETA: 6s - loss: 0.9950 - regression_loss: 0.8861 - classification_loss: 0.1089 473/500 [===========================>..] 
[epoch 39: intermediate per-batch progress updates (steps 474–499, flattened carriage-return output) elided]
500/500 [==============================] - 125s 249ms/step - loss: 0.9934 - regression_loss: 0.8848 - classification_loss: 0.1086
1172 instances of class plum with average precision: 0.7980
mAP: 0.7980
Epoch 00039: saving model to ./training/snapshots/resnet50_pascal_39.h5
Epoch 40/150
[epoch 40: intermediate per-batch progress updates (steps 1–308) elided]
- ETA: 47s - loss: 0.9711 - regression_loss: 0.8659 - classification_loss: 0.1051 309/500 [=================>............] - ETA: 47s - loss: 0.9720 - regression_loss: 0.8668 - classification_loss: 0.1052 310/500 [=================>............] - ETA: 47s - loss: 0.9717 - regression_loss: 0.8665 - classification_loss: 0.1053 311/500 [=================>............] - ETA: 47s - loss: 0.9718 - regression_loss: 0.8665 - classification_loss: 0.1053 312/500 [=================>............] - ETA: 46s - loss: 0.9716 - regression_loss: 0.8663 - classification_loss: 0.1053 313/500 [=================>............] - ETA: 46s - loss: 0.9712 - regression_loss: 0.8661 - classification_loss: 0.1051 314/500 [=================>............] - ETA: 46s - loss: 0.9721 - regression_loss: 0.8668 - classification_loss: 0.1053 315/500 [=================>............] - ETA: 46s - loss: 0.9717 - regression_loss: 0.8664 - classification_loss: 0.1053 316/500 [=================>............] - ETA: 45s - loss: 0.9718 - regression_loss: 0.8665 - classification_loss: 0.1053 317/500 [==================>...........] - ETA: 45s - loss: 0.9700 - regression_loss: 0.8650 - classification_loss: 0.1050 318/500 [==================>...........] - ETA: 45s - loss: 0.9690 - regression_loss: 0.8641 - classification_loss: 0.1049 319/500 [==================>...........] - ETA: 45s - loss: 0.9682 - regression_loss: 0.8635 - classification_loss: 0.1047 320/500 [==================>...........] - ETA: 44s - loss: 0.9668 - regression_loss: 0.8622 - classification_loss: 0.1046 321/500 [==================>...........] - ETA: 44s - loss: 0.9673 - regression_loss: 0.8626 - classification_loss: 0.1047 322/500 [==================>...........] - ETA: 44s - loss: 0.9682 - regression_loss: 0.8634 - classification_loss: 0.1048 323/500 [==================>...........] - ETA: 44s - loss: 0.9699 - regression_loss: 0.8648 - classification_loss: 0.1052 324/500 [==================>...........] 
- ETA: 43s - loss: 0.9683 - regression_loss: 0.8634 - classification_loss: 0.1050 325/500 [==================>...........] - ETA: 43s - loss: 0.9683 - regression_loss: 0.8634 - classification_loss: 0.1049 326/500 [==================>...........] - ETA: 43s - loss: 0.9685 - regression_loss: 0.8637 - classification_loss: 0.1049 327/500 [==================>...........] - ETA: 43s - loss: 0.9684 - regression_loss: 0.8636 - classification_loss: 0.1048 328/500 [==================>...........] - ETA: 42s - loss: 0.9687 - regression_loss: 0.8639 - classification_loss: 0.1048 329/500 [==================>...........] - ETA: 42s - loss: 0.9691 - regression_loss: 0.8644 - classification_loss: 0.1047 330/500 [==================>...........] - ETA: 42s - loss: 0.9675 - regression_loss: 0.8630 - classification_loss: 0.1044 331/500 [==================>...........] - ETA: 42s - loss: 0.9684 - regression_loss: 0.8638 - classification_loss: 0.1046 332/500 [==================>...........] - ETA: 41s - loss: 0.9690 - regression_loss: 0.8644 - classification_loss: 0.1046 333/500 [==================>...........] - ETA: 41s - loss: 0.9691 - regression_loss: 0.8645 - classification_loss: 0.1047 334/500 [===================>..........] - ETA: 41s - loss: 0.9696 - regression_loss: 0.8649 - classification_loss: 0.1047 335/500 [===================>..........] - ETA: 41s - loss: 0.9687 - regression_loss: 0.8642 - classification_loss: 0.1045 336/500 [===================>..........] - ETA: 40s - loss: 0.9673 - regression_loss: 0.8630 - classification_loss: 0.1043 337/500 [===================>..........] - ETA: 40s - loss: 0.9681 - regression_loss: 0.8636 - classification_loss: 0.1044 338/500 [===================>..........] - ETA: 40s - loss: 0.9694 - regression_loss: 0.8648 - classification_loss: 0.1046 339/500 [===================>..........] - ETA: 40s - loss: 0.9679 - regression_loss: 0.8635 - classification_loss: 0.1045 340/500 [===================>..........] 
- ETA: 39s - loss: 0.9700 - regression_loss: 0.8652 - classification_loss: 0.1048 341/500 [===================>..........] - ETA: 39s - loss: 0.9714 - regression_loss: 0.8661 - classification_loss: 0.1052 342/500 [===================>..........] - ETA: 39s - loss: 0.9721 - regression_loss: 0.8667 - classification_loss: 0.1054 343/500 [===================>..........] - ETA: 39s - loss: 0.9734 - regression_loss: 0.8675 - classification_loss: 0.1060 344/500 [===================>..........] - ETA: 38s - loss: 0.9736 - regression_loss: 0.8677 - classification_loss: 0.1060 345/500 [===================>..........] - ETA: 38s - loss: 0.9737 - regression_loss: 0.8677 - classification_loss: 0.1060 346/500 [===================>..........] - ETA: 38s - loss: 0.9740 - regression_loss: 0.8681 - classification_loss: 0.1059 347/500 [===================>..........] - ETA: 38s - loss: 0.9745 - regression_loss: 0.8686 - classification_loss: 0.1059 348/500 [===================>..........] - ETA: 37s - loss: 0.9737 - regression_loss: 0.8679 - classification_loss: 0.1058 349/500 [===================>..........] - ETA: 37s - loss: 0.9738 - regression_loss: 0.8679 - classification_loss: 0.1058 350/500 [====================>.........] - ETA: 37s - loss: 0.9734 - regression_loss: 0.8676 - classification_loss: 0.1058 351/500 [====================>.........] - ETA: 37s - loss: 0.9746 - regression_loss: 0.8687 - classification_loss: 0.1058 352/500 [====================>.........] - ETA: 36s - loss: 0.9757 - regression_loss: 0.8696 - classification_loss: 0.1060 353/500 [====================>.........] - ETA: 36s - loss: 0.9760 - regression_loss: 0.8699 - classification_loss: 0.1061 354/500 [====================>.........] - ETA: 36s - loss: 0.9754 - regression_loss: 0.8694 - classification_loss: 0.1059 355/500 [====================>.........] - ETA: 36s - loss: 0.9753 - regression_loss: 0.8694 - classification_loss: 0.1060 356/500 [====================>.........] 
- ETA: 35s - loss: 0.9764 - regression_loss: 0.8703 - classification_loss: 0.1061 357/500 [====================>.........] - ETA: 35s - loss: 0.9760 - regression_loss: 0.8700 - classification_loss: 0.1060 358/500 [====================>.........] - ETA: 35s - loss: 0.9752 - regression_loss: 0.8693 - classification_loss: 0.1059 359/500 [====================>.........] - ETA: 35s - loss: 0.9736 - regression_loss: 0.8679 - classification_loss: 0.1057 360/500 [====================>.........] - ETA: 34s - loss: 0.9715 - regression_loss: 0.8660 - classification_loss: 0.1055 361/500 [====================>.........] - ETA: 34s - loss: 0.9714 - regression_loss: 0.8660 - classification_loss: 0.1054 362/500 [====================>.........] - ETA: 34s - loss: 0.9717 - regression_loss: 0.8663 - classification_loss: 0.1054 363/500 [====================>.........] - ETA: 34s - loss: 0.9703 - regression_loss: 0.8651 - classification_loss: 0.1052 364/500 [====================>.........] - ETA: 33s - loss: 0.9707 - regression_loss: 0.8656 - classification_loss: 0.1051 365/500 [====================>.........] - ETA: 33s - loss: 0.9704 - regression_loss: 0.8655 - classification_loss: 0.1049 366/500 [====================>.........] - ETA: 33s - loss: 0.9714 - regression_loss: 0.8663 - classification_loss: 0.1051 367/500 [=====================>........] - ETA: 33s - loss: 0.9698 - regression_loss: 0.8649 - classification_loss: 0.1049 368/500 [=====================>........] - ETA: 33s - loss: 0.9688 - regression_loss: 0.8640 - classification_loss: 0.1048 369/500 [=====================>........] - ETA: 32s - loss: 0.9668 - regression_loss: 0.8622 - classification_loss: 0.1046 370/500 [=====================>........] - ETA: 32s - loss: 0.9677 - regression_loss: 0.8630 - classification_loss: 0.1048 371/500 [=====================>........] - ETA: 32s - loss: 0.9684 - regression_loss: 0.8636 - classification_loss: 0.1048 372/500 [=====================>........] 
- ETA: 31s - loss: 0.9690 - regression_loss: 0.8642 - classification_loss: 0.1049 373/500 [=====================>........] - ETA: 31s - loss: 0.9673 - regression_loss: 0.8627 - classification_loss: 0.1047 374/500 [=====================>........] - ETA: 31s - loss: 0.9713 - regression_loss: 0.8663 - classification_loss: 0.1050 375/500 [=====================>........] - ETA: 31s - loss: 0.9719 - regression_loss: 0.8669 - classification_loss: 0.1050 376/500 [=====================>........] - ETA: 30s - loss: 0.9739 - regression_loss: 0.8686 - classification_loss: 0.1053 377/500 [=====================>........] - ETA: 30s - loss: 0.9739 - regression_loss: 0.8686 - classification_loss: 0.1053 378/500 [=====================>........] - ETA: 30s - loss: 0.9739 - regression_loss: 0.8686 - classification_loss: 0.1053 379/500 [=====================>........] - ETA: 30s - loss: 0.9742 - regression_loss: 0.8688 - classification_loss: 0.1055 380/500 [=====================>........] - ETA: 29s - loss: 0.9750 - regression_loss: 0.8694 - classification_loss: 0.1056 381/500 [=====================>........] - ETA: 29s - loss: 0.9760 - regression_loss: 0.8703 - classification_loss: 0.1057 382/500 [=====================>........] - ETA: 29s - loss: 0.9766 - regression_loss: 0.8708 - classification_loss: 0.1059 383/500 [=====================>........] - ETA: 29s - loss: 0.9772 - regression_loss: 0.8712 - classification_loss: 0.1060 384/500 [======================>.......] - ETA: 28s - loss: 0.9777 - regression_loss: 0.8717 - classification_loss: 0.1060 385/500 [======================>.......] - ETA: 28s - loss: 0.9774 - regression_loss: 0.8715 - classification_loss: 0.1059 386/500 [======================>.......] - ETA: 28s - loss: 0.9770 - regression_loss: 0.8711 - classification_loss: 0.1059 387/500 [======================>.......] - ETA: 28s - loss: 0.9766 - regression_loss: 0.8707 - classification_loss: 0.1059 388/500 [======================>.......] 
- ETA: 27s - loss: 0.9762 - regression_loss: 0.8704 - classification_loss: 0.1058 389/500 [======================>.......] - ETA: 27s - loss: 0.9759 - regression_loss: 0.8700 - classification_loss: 0.1058 390/500 [======================>.......] - ETA: 27s - loss: 0.9767 - regression_loss: 0.8707 - classification_loss: 0.1060 391/500 [======================>.......] - ETA: 27s - loss: 0.9760 - regression_loss: 0.8700 - classification_loss: 0.1060 392/500 [======================>.......] - ETA: 26s - loss: 0.9763 - regression_loss: 0.8703 - classification_loss: 0.1060 393/500 [======================>.......] - ETA: 26s - loss: 0.9766 - regression_loss: 0.8704 - classification_loss: 0.1061 394/500 [======================>.......] - ETA: 26s - loss: 0.9769 - regression_loss: 0.8707 - classification_loss: 0.1062 395/500 [======================>.......] - ETA: 26s - loss: 0.9756 - regression_loss: 0.8696 - classification_loss: 0.1060 396/500 [======================>.......] - ETA: 25s - loss: 0.9776 - regression_loss: 0.8714 - classification_loss: 0.1063 397/500 [======================>.......] - ETA: 25s - loss: 0.9769 - regression_loss: 0.8707 - classification_loss: 0.1062 398/500 [======================>.......] - ETA: 25s - loss: 0.9766 - regression_loss: 0.8705 - classification_loss: 0.1061 399/500 [======================>.......] - ETA: 25s - loss: 0.9771 - regression_loss: 0.8709 - classification_loss: 0.1062 400/500 [=======================>......] - ETA: 24s - loss: 0.9759 - regression_loss: 0.8699 - classification_loss: 0.1061 401/500 [=======================>......] - ETA: 24s - loss: 0.9754 - regression_loss: 0.8695 - classification_loss: 0.1059 402/500 [=======================>......] - ETA: 24s - loss: 0.9744 - regression_loss: 0.8686 - classification_loss: 0.1057 403/500 [=======================>......] - ETA: 24s - loss: 0.9735 - regression_loss: 0.8679 - classification_loss: 0.1056 404/500 [=======================>......] 
- ETA: 23s - loss: 0.9743 - regression_loss: 0.8686 - classification_loss: 0.1057 405/500 [=======================>......] - ETA: 23s - loss: 0.9748 - regression_loss: 0.8691 - classification_loss: 0.1057 406/500 [=======================>......] - ETA: 23s - loss: 0.9731 - regression_loss: 0.8676 - classification_loss: 0.1056 407/500 [=======================>......] - ETA: 23s - loss: 0.9720 - regression_loss: 0.8666 - classification_loss: 0.1054 408/500 [=======================>......] - ETA: 22s - loss: 0.9726 - regression_loss: 0.8671 - classification_loss: 0.1055 409/500 [=======================>......] - ETA: 22s - loss: 0.9735 - regression_loss: 0.8679 - classification_loss: 0.1056 410/500 [=======================>......] - ETA: 22s - loss: 0.9731 - regression_loss: 0.8675 - classification_loss: 0.1056 411/500 [=======================>......] - ETA: 22s - loss: 0.9747 - regression_loss: 0.8690 - classification_loss: 0.1057 412/500 [=======================>......] - ETA: 21s - loss: 0.9751 - regression_loss: 0.8692 - classification_loss: 0.1058 413/500 [=======================>......] - ETA: 21s - loss: 0.9749 - regression_loss: 0.8690 - classification_loss: 0.1059 414/500 [=======================>......] - ETA: 21s - loss: 0.9748 - regression_loss: 0.8690 - classification_loss: 0.1058 415/500 [=======================>......] - ETA: 21s - loss: 0.9750 - regression_loss: 0.8691 - classification_loss: 0.1059 416/500 [=======================>......] - ETA: 20s - loss: 0.9760 - regression_loss: 0.8699 - classification_loss: 0.1061 417/500 [========================>.....] - ETA: 20s - loss: 0.9756 - regression_loss: 0.8697 - classification_loss: 0.1059 418/500 [========================>.....] - ETA: 20s - loss: 0.9756 - regression_loss: 0.8697 - classification_loss: 0.1059 419/500 [========================>.....] - ETA: 20s - loss: 0.9758 - regression_loss: 0.8699 - classification_loss: 0.1058 420/500 [========================>.....] 
- ETA: 19s - loss: 0.9748 - regression_loss: 0.8692 - classification_loss: 0.1057 421/500 [========================>.....] - ETA: 19s - loss: 0.9742 - regression_loss: 0.8685 - classification_loss: 0.1056 422/500 [========================>.....] - ETA: 19s - loss: 0.9751 - regression_loss: 0.8695 - classification_loss: 0.1056 423/500 [========================>.....] - ETA: 19s - loss: 0.9758 - regression_loss: 0.8701 - classification_loss: 0.1057 424/500 [========================>.....] - ETA: 18s - loss: 0.9762 - regression_loss: 0.8705 - classification_loss: 0.1057 425/500 [========================>.....] - ETA: 18s - loss: 0.9761 - regression_loss: 0.8705 - classification_loss: 0.1056 426/500 [========================>.....] - ETA: 18s - loss: 0.9774 - regression_loss: 0.8716 - classification_loss: 0.1058 427/500 [========================>.....] - ETA: 18s - loss: 0.9778 - regression_loss: 0.8720 - classification_loss: 0.1058 428/500 [========================>.....] - ETA: 17s - loss: 0.9795 - regression_loss: 0.8736 - classification_loss: 0.1060 429/500 [========================>.....] - ETA: 17s - loss: 0.9796 - regression_loss: 0.8736 - classification_loss: 0.1060 430/500 [========================>.....] - ETA: 17s - loss: 0.9795 - regression_loss: 0.8735 - classification_loss: 0.1059 431/500 [========================>.....] - ETA: 17s - loss: 0.9792 - regression_loss: 0.8734 - classification_loss: 0.1058 432/500 [========================>.....] - ETA: 16s - loss: 0.9798 - regression_loss: 0.8739 - classification_loss: 0.1059 433/500 [========================>.....] - ETA: 16s - loss: 0.9800 - regression_loss: 0.8741 - classification_loss: 0.1059 434/500 [=========================>....] - ETA: 16s - loss: 0.9793 - regression_loss: 0.8735 - classification_loss: 0.1058 435/500 [=========================>....] - ETA: 16s - loss: 0.9791 - regression_loss: 0.8734 - classification_loss: 0.1057 436/500 [=========================>....] 
- ETA: 15s - loss: 0.9789 - regression_loss: 0.8732 - classification_loss: 0.1057 437/500 [=========================>....] - ETA: 15s - loss: 0.9773 - regression_loss: 0.8718 - classification_loss: 0.1055 438/500 [=========================>....] - ETA: 15s - loss: 0.9764 - regression_loss: 0.8710 - classification_loss: 0.1054 439/500 [=========================>....] - ETA: 15s - loss: 0.9764 - regression_loss: 0.8711 - classification_loss: 0.1053 440/500 [=========================>....] - ETA: 14s - loss: 0.9765 - regression_loss: 0.8711 - classification_loss: 0.1054 441/500 [=========================>....] - ETA: 14s - loss: 0.9750 - regression_loss: 0.8698 - classification_loss: 0.1052 442/500 [=========================>....] - ETA: 14s - loss: 0.9749 - regression_loss: 0.8697 - classification_loss: 0.1052 443/500 [=========================>....] - ETA: 14s - loss: 0.9758 - regression_loss: 0.8705 - classification_loss: 0.1053 444/500 [=========================>....] - ETA: 13s - loss: 0.9777 - regression_loss: 0.8718 - classification_loss: 0.1059 445/500 [=========================>....] - ETA: 13s - loss: 0.9794 - regression_loss: 0.8733 - classification_loss: 0.1061 446/500 [=========================>....] - ETA: 13s - loss: 0.9791 - regression_loss: 0.8730 - classification_loss: 0.1061 447/500 [=========================>....] - ETA: 13s - loss: 0.9793 - regression_loss: 0.8732 - classification_loss: 0.1061 448/500 [=========================>....] - ETA: 12s - loss: 0.9803 - regression_loss: 0.8741 - classification_loss: 0.1062 449/500 [=========================>....] - ETA: 12s - loss: 0.9805 - regression_loss: 0.8743 - classification_loss: 0.1062 450/500 [==========================>...] - ETA: 12s - loss: 0.9796 - regression_loss: 0.8736 - classification_loss: 0.1060 451/500 [==========================>...] - ETA: 12s - loss: 0.9795 - regression_loss: 0.8735 - classification_loss: 0.1060 452/500 [==========================>...] 
- ETA: 11s - loss: 0.9801 - regression_loss: 0.8739 - classification_loss: 0.1061 453/500 [==========================>...] - ETA: 11s - loss: 0.9787 - regression_loss: 0.8727 - classification_loss: 0.1059 454/500 [==========================>...] - ETA: 11s - loss: 0.9790 - regression_loss: 0.8731 - classification_loss: 0.1060 455/500 [==========================>...] - ETA: 11s - loss: 0.9804 - regression_loss: 0.8742 - classification_loss: 0.1062 456/500 [==========================>...] - ETA: 10s - loss: 0.9798 - regression_loss: 0.8738 - classification_loss: 0.1061 457/500 [==========================>...] - ETA: 10s - loss: 0.9799 - regression_loss: 0.8739 - classification_loss: 0.1060 458/500 [==========================>...] - ETA: 10s - loss: 0.9802 - regression_loss: 0.8740 - classification_loss: 0.1061 459/500 [==========================>...] - ETA: 10s - loss: 0.9815 - regression_loss: 0.8751 - classification_loss: 0.1064 460/500 [==========================>...] - ETA: 9s - loss: 0.9816 - regression_loss: 0.8752 - classification_loss: 0.1064  461/500 [==========================>...] - ETA: 9s - loss: 0.9815 - regression_loss: 0.8751 - classification_loss: 0.1064 462/500 [==========================>...] - ETA: 9s - loss: 0.9806 - regression_loss: 0.8744 - classification_loss: 0.1063 463/500 [==========================>...] - ETA: 9s - loss: 0.9806 - regression_loss: 0.8742 - classification_loss: 0.1063 464/500 [==========================>...] - ETA: 8s - loss: 0.9800 - regression_loss: 0.8737 - classification_loss: 0.1063 465/500 [==========================>...] - ETA: 8s - loss: 0.9799 - regression_loss: 0.8736 - classification_loss: 0.1063 466/500 [==========================>...] - ETA: 8s - loss: 0.9792 - regression_loss: 0.8728 - classification_loss: 0.1064 467/500 [===========================>..] - ETA: 8s - loss: 0.9790 - regression_loss: 0.8728 - classification_loss: 0.1063 468/500 [===========================>..] 
- ETA: 7s - loss: 0.9787 - regression_loss: 0.8725 - classification_loss: 0.1062 469/500 [===========================>..] - ETA: 7s - loss: 0.9780 - regression_loss: 0.8718 - classification_loss: 0.1062 470/500 [===========================>..] - ETA: 7s - loss: 0.9780 - regression_loss: 0.8719 - classification_loss: 0.1062 471/500 [===========================>..] - ETA: 7s - loss: 0.9783 - regression_loss: 0.8721 - classification_loss: 0.1062 472/500 [===========================>..] - ETA: 6s - loss: 0.9790 - regression_loss: 0.8729 - classification_loss: 0.1061 473/500 [===========================>..] - ETA: 6s - loss: 0.9780 - regression_loss: 0.8719 - classification_loss: 0.1060 474/500 [===========================>..] - ETA: 6s - loss: 0.9772 - regression_loss: 0.8713 - classification_loss: 0.1059 475/500 [===========================>..] - ETA: 6s - loss: 0.9762 - regression_loss: 0.8705 - classification_loss: 0.1058 476/500 [===========================>..] - ETA: 5s - loss: 0.9775 - regression_loss: 0.8714 - classification_loss: 0.1060 477/500 [===========================>..] - ETA: 5s - loss: 0.9782 - regression_loss: 0.8721 - classification_loss: 0.1061 478/500 [===========================>..] - ETA: 5s - loss: 0.9774 - regression_loss: 0.8714 - classification_loss: 0.1060 479/500 [===========================>..] - ETA: 5s - loss: 0.9777 - regression_loss: 0.8716 - classification_loss: 0.1061 480/500 [===========================>..] - ETA: 4s - loss: 0.9788 - regression_loss: 0.8725 - classification_loss: 0.1063 481/500 [===========================>..] - ETA: 4s - loss: 0.9775 - regression_loss: 0.8714 - classification_loss: 0.1061 482/500 [===========================>..] - ETA: 4s - loss: 0.9776 - regression_loss: 0.8714 - classification_loss: 0.1061 483/500 [===========================>..] - ETA: 4s - loss: 0.9774 - regression_loss: 0.8712 - classification_loss: 0.1061 484/500 [============================>.] 
- ETA: 3s - loss: 0.9764 - regression_loss: 0.8704 - classification_loss: 0.1060 485/500 [============================>.] - ETA: 3s - loss: 0.9771 - regression_loss: 0.8710 - classification_loss: 0.1060 486/500 [============================>.] - ETA: 3s - loss: 0.9769 - regression_loss: 0.8709 - classification_loss: 0.1059 487/500 [============================>.] - ETA: 3s - loss: 0.9782 - regression_loss: 0.8720 - classification_loss: 0.1062 488/500 [============================>.] - ETA: 2s - loss: 0.9777 - regression_loss: 0.8716 - classification_loss: 0.1061 489/500 [============================>.] - ETA: 2s - loss: 0.9781 - regression_loss: 0.8720 - classification_loss: 0.1062 490/500 [============================>.] - ETA: 2s - loss: 0.9770 - regression_loss: 0.8710 - classification_loss: 0.1060 491/500 [============================>.] - ETA: 2s - loss: 0.9771 - regression_loss: 0.8711 - classification_loss: 0.1061 492/500 [============================>.] - ETA: 1s - loss: 0.9780 - regression_loss: 0.8718 - classification_loss: 0.1062 493/500 [============================>.] - ETA: 1s - loss: 0.9792 - regression_loss: 0.8725 - classification_loss: 0.1067 494/500 [============================>.] - ETA: 1s - loss: 0.9791 - regression_loss: 0.8724 - classification_loss: 0.1066 495/500 [============================>.] - ETA: 1s - loss: 0.9798 - regression_loss: 0.8732 - classification_loss: 0.1067 496/500 [============================>.] - ETA: 0s - loss: 0.9791 - regression_loss: 0.8725 - classification_loss: 0.1066 497/500 [============================>.] - ETA: 0s - loss: 0.9790 - regression_loss: 0.8724 - classification_loss: 0.1065 498/500 [============================>.] - ETA: 0s - loss: 0.9790 - regression_loss: 0.8724 - classification_loss: 0.1065 499/500 [============================>.] 
- ETA: 0s - loss: 0.9779 - regression_loss: 0.8715 - classification_loss: 0.1064 500/500 [==============================] - 125s 250ms/step - loss: 0.9782 - regression_loss: 0.8719 - classification_loss: 0.1063 1172 instances of class plum with average precision: 0.7983 mAP: 0.7983 Epoch 00040: saving model to ./training/snapshots/resnet50_pascal_40.h5 Epoch 41/150 1/500 [..............................] - ETA: 1:54 - loss: 1.5879 - regression_loss: 1.3929 - classification_loss: 0.1950 2/500 [..............................] - ETA: 2:01 - loss: 1.3261 - regression_loss: 1.1928 - classification_loss: 0.1333 3/500 [..............................] - ETA: 2:03 - loss: 1.0756 - regression_loss: 0.9782 - classification_loss: 0.0975 4/500 [..............................] - ETA: 2:03 - loss: 1.1617 - regression_loss: 1.0425 - classification_loss: 0.1193 5/500 [..............................] - ETA: 2:02 - loss: 1.0564 - regression_loss: 0.9544 - classification_loss: 0.1020 6/500 [..............................] - ETA: 2:02 - loss: 0.9319 - regression_loss: 0.8441 - classification_loss: 0.0878 7/500 [..............................] - ETA: 2:02 - loss: 0.9406 - regression_loss: 0.8486 - classification_loss: 0.0920 8/500 [..............................] - ETA: 2:02 - loss: 0.9549 - regression_loss: 0.8572 - classification_loss: 0.0977 9/500 [..............................] - ETA: 2:02 - loss: 0.9667 - regression_loss: 0.8692 - classification_loss: 0.0975 10/500 [..............................] - ETA: 2:02 - loss: 0.9485 - regression_loss: 0.8519 - classification_loss: 0.0967 11/500 [..............................] - ETA: 2:02 - loss: 0.9514 - regression_loss: 0.8525 - classification_loss: 0.0988 12/500 [..............................] - ETA: 2:02 - loss: 0.9186 - regression_loss: 0.8258 - classification_loss: 0.0927 13/500 [..............................] - ETA: 2:01 - loss: 0.9402 - regression_loss: 0.8465 - classification_loss: 0.0937 14/500 [..............................] 
- ETA: 2:01 - loss: 0.9729 - regression_loss: 0.8721 - classification_loss: 0.1009
[... per-step progress updates condensed; every 50th step shown below ...]
 15/500 [..............................] - ETA: 2:02 - loss: 0.9967 - regression_loss: 0.8912 - classification_loss: 0.1054
 50/500 [==>...........................] - ETA: 1:53 - loss: 1.0119 - regression_loss: 0.9029 - classification_loss: 0.1091
100/500 [=====>........................] - ETA: 1:39 - loss: 0.9784 - regression_loss: 0.8725 - classification_loss: 0.1060
150/500 [========>.....................] - ETA: 1:27 - loss: 0.9537 - regression_loss: 0.8508 - classification_loss: 0.1029
200/500 [===========>..................] - ETA: 1:15 - loss: 0.9534 - regression_loss: 0.8519 - classification_loss: 0.1015
250/500 [==============>...............] - ETA: 1:02 - loss: 0.9611 - regression_loss: 0.8574 - classification_loss: 0.1037
300/500 [=================>............] - ETA: 50s - loss: 0.9770 - regression_loss: 0.8707 - classification_loss: 0.1063
349/500 [===================>..........] - ETA: 37s - loss: 0.9805 - regression_loss: 0.8743 - classification_loss: 0.1061 350/500 [====================>.........]
- ETA: 37s - loss: 0.9817 - regression_loss: 0.8753 - classification_loss: 0.1064 351/500 [====================>.........] - ETA: 37s - loss: 0.9832 - regression_loss: 0.8766 - classification_loss: 0.1066 352/500 [====================>.........] - ETA: 37s - loss: 0.9814 - regression_loss: 0.8750 - classification_loss: 0.1064 353/500 [====================>.........] - ETA: 36s - loss: 0.9826 - regression_loss: 0.8761 - classification_loss: 0.1065 354/500 [====================>.........] - ETA: 36s - loss: 0.9831 - regression_loss: 0.8765 - classification_loss: 0.1066 355/500 [====================>.........] - ETA: 36s - loss: 0.9829 - regression_loss: 0.8763 - classification_loss: 0.1066 356/500 [====================>.........] - ETA: 36s - loss: 0.9837 - regression_loss: 0.8770 - classification_loss: 0.1067 357/500 [====================>.........] - ETA: 35s - loss: 0.9838 - regression_loss: 0.8770 - classification_loss: 0.1068 358/500 [====================>.........] - ETA: 35s - loss: 0.9833 - regression_loss: 0.8766 - classification_loss: 0.1067 359/500 [====================>.........] - ETA: 35s - loss: 0.9831 - regression_loss: 0.8763 - classification_loss: 0.1068 360/500 [====================>.........] - ETA: 35s - loss: 0.9839 - regression_loss: 0.8771 - classification_loss: 0.1068 361/500 [====================>.........] - ETA: 34s - loss: 0.9841 - regression_loss: 0.8773 - classification_loss: 0.1069 362/500 [====================>.........] - ETA: 34s - loss: 0.9847 - regression_loss: 0.8778 - classification_loss: 0.1069 363/500 [====================>.........] - ETA: 34s - loss: 0.9843 - regression_loss: 0.8774 - classification_loss: 0.1068 364/500 [====================>.........] - ETA: 34s - loss: 0.9830 - regression_loss: 0.8763 - classification_loss: 0.1066 365/500 [====================>.........] - ETA: 33s - loss: 0.9834 - regression_loss: 0.8767 - classification_loss: 0.1067 366/500 [====================>.........] 
- ETA: 33s - loss: 0.9842 - regression_loss: 0.8774 - classification_loss: 0.1068 367/500 [=====================>........] - ETA: 33s - loss: 0.9830 - regression_loss: 0.8764 - classification_loss: 0.1066 368/500 [=====================>........] - ETA: 33s - loss: 0.9836 - regression_loss: 0.8770 - classification_loss: 0.1067 369/500 [=====================>........] - ETA: 32s - loss: 0.9824 - regression_loss: 0.8759 - classification_loss: 0.1066 370/500 [=====================>........] - ETA: 32s - loss: 0.9817 - regression_loss: 0.8753 - classification_loss: 0.1064 371/500 [=====================>........] - ETA: 32s - loss: 0.9807 - regression_loss: 0.8744 - classification_loss: 0.1063 372/500 [=====================>........] - ETA: 32s - loss: 0.9808 - regression_loss: 0.8745 - classification_loss: 0.1063 373/500 [=====================>........] - ETA: 31s - loss: 0.9803 - regression_loss: 0.8740 - classification_loss: 0.1063 374/500 [=====================>........] - ETA: 31s - loss: 0.9794 - regression_loss: 0.8731 - classification_loss: 0.1063 375/500 [=====================>........] - ETA: 31s - loss: 0.9800 - regression_loss: 0.8737 - classification_loss: 0.1063 376/500 [=====================>........] - ETA: 31s - loss: 0.9814 - regression_loss: 0.8750 - classification_loss: 0.1064 377/500 [=====================>........] - ETA: 30s - loss: 0.9803 - regression_loss: 0.8741 - classification_loss: 0.1062 378/500 [=====================>........] - ETA: 30s - loss: 0.9806 - regression_loss: 0.8743 - classification_loss: 0.1062 379/500 [=====================>........] - ETA: 30s - loss: 0.9788 - regression_loss: 0.8728 - classification_loss: 0.1060 380/500 [=====================>........] - ETA: 30s - loss: 0.9784 - regression_loss: 0.8725 - classification_loss: 0.1059 381/500 [=====================>........] - ETA: 29s - loss: 0.9783 - regression_loss: 0.8724 - classification_loss: 0.1059 382/500 [=====================>........] 
- ETA: 29s - loss: 0.9790 - regression_loss: 0.8731 - classification_loss: 0.1059 383/500 [=====================>........] - ETA: 29s - loss: 0.9804 - regression_loss: 0.8743 - classification_loss: 0.1062 384/500 [======================>.......] - ETA: 29s - loss: 0.9810 - regression_loss: 0.8749 - classification_loss: 0.1061 385/500 [======================>.......] - ETA: 28s - loss: 0.9818 - regression_loss: 0.8755 - classification_loss: 0.1063 386/500 [======================>.......] - ETA: 28s - loss: 0.9827 - regression_loss: 0.8763 - classification_loss: 0.1065 387/500 [======================>.......] - ETA: 28s - loss: 0.9832 - regression_loss: 0.8767 - classification_loss: 0.1064 388/500 [======================>.......] - ETA: 28s - loss: 0.9816 - regression_loss: 0.8753 - classification_loss: 0.1063 389/500 [======================>.......] - ETA: 27s - loss: 0.9800 - regression_loss: 0.8740 - classification_loss: 0.1061 390/500 [======================>.......] - ETA: 27s - loss: 0.9793 - regression_loss: 0.8735 - classification_loss: 0.1058 391/500 [======================>.......] - ETA: 27s - loss: 0.9802 - regression_loss: 0.8742 - classification_loss: 0.1060 392/500 [======================>.......] - ETA: 27s - loss: 0.9786 - regression_loss: 0.8728 - classification_loss: 0.1058 393/500 [======================>.......] - ETA: 26s - loss: 0.9786 - regression_loss: 0.8730 - classification_loss: 0.1057 394/500 [======================>.......] - ETA: 26s - loss: 0.9792 - regression_loss: 0.8734 - classification_loss: 0.1058 395/500 [======================>.......] - ETA: 26s - loss: 0.9792 - regression_loss: 0.8735 - classification_loss: 0.1058 396/500 [======================>.......] - ETA: 26s - loss: 0.9788 - regression_loss: 0.8730 - classification_loss: 0.1058 397/500 [======================>.......] - ETA: 25s - loss: 0.9794 - regression_loss: 0.8734 - classification_loss: 0.1060 398/500 [======================>.......] 
- ETA: 25s - loss: 0.9810 - regression_loss: 0.8748 - classification_loss: 0.1062 399/500 [======================>.......] - ETA: 25s - loss: 0.9817 - regression_loss: 0.8754 - classification_loss: 0.1063 400/500 [=======================>......] - ETA: 25s - loss: 0.9817 - regression_loss: 0.8754 - classification_loss: 0.1063 401/500 [=======================>......] - ETA: 24s - loss: 0.9805 - regression_loss: 0.8743 - classification_loss: 0.1061 402/500 [=======================>......] - ETA: 24s - loss: 0.9808 - regression_loss: 0.8746 - classification_loss: 0.1062 403/500 [=======================>......] - ETA: 24s - loss: 0.9811 - regression_loss: 0.8748 - classification_loss: 0.1063 404/500 [=======================>......] - ETA: 24s - loss: 0.9811 - regression_loss: 0.8748 - classification_loss: 0.1063 405/500 [=======================>......] - ETA: 23s - loss: 0.9814 - regression_loss: 0.8751 - classification_loss: 0.1063 406/500 [=======================>......] - ETA: 23s - loss: 0.9822 - regression_loss: 0.8757 - classification_loss: 0.1065 407/500 [=======================>......] - ETA: 23s - loss: 0.9817 - regression_loss: 0.8753 - classification_loss: 0.1063 408/500 [=======================>......] - ETA: 23s - loss: 0.9822 - regression_loss: 0.8757 - classification_loss: 0.1065 409/500 [=======================>......] - ETA: 22s - loss: 0.9826 - regression_loss: 0.8760 - classification_loss: 0.1066 410/500 [=======================>......] - ETA: 22s - loss: 0.9823 - regression_loss: 0.8758 - classification_loss: 0.1066 411/500 [=======================>......] - ETA: 22s - loss: 0.9831 - regression_loss: 0.8764 - classification_loss: 0.1067 412/500 [=======================>......] - ETA: 22s - loss: 0.9827 - regression_loss: 0.8761 - classification_loss: 0.1066 413/500 [=======================>......] - ETA: 21s - loss: 0.9815 - regression_loss: 0.8751 - classification_loss: 0.1064 414/500 [=======================>......] 
- ETA: 21s - loss: 0.9803 - regression_loss: 0.8740 - classification_loss: 0.1062 415/500 [=======================>......] - ETA: 21s - loss: 0.9800 - regression_loss: 0.8738 - classification_loss: 0.1062 416/500 [=======================>......] - ETA: 21s - loss: 0.9804 - regression_loss: 0.8742 - classification_loss: 0.1062 417/500 [========================>.....] - ETA: 20s - loss: 0.9810 - regression_loss: 0.8747 - classification_loss: 0.1063 418/500 [========================>.....] - ETA: 20s - loss: 0.9809 - regression_loss: 0.8746 - classification_loss: 0.1063 419/500 [========================>.....] - ETA: 20s - loss: 0.9809 - regression_loss: 0.8745 - classification_loss: 0.1063 420/500 [========================>.....] - ETA: 20s - loss: 0.9802 - regression_loss: 0.8739 - classification_loss: 0.1063 421/500 [========================>.....] - ETA: 19s - loss: 0.9786 - regression_loss: 0.8725 - classification_loss: 0.1061 422/500 [========================>.....] - ETA: 19s - loss: 0.9791 - regression_loss: 0.8729 - classification_loss: 0.1062 423/500 [========================>.....] - ETA: 19s - loss: 0.9774 - regression_loss: 0.8715 - classification_loss: 0.1059 424/500 [========================>.....] - ETA: 19s - loss: 0.9779 - regression_loss: 0.8719 - classification_loss: 0.1060 425/500 [========================>.....] - ETA: 18s - loss: 0.9773 - regression_loss: 0.8715 - classification_loss: 0.1059 426/500 [========================>.....] - ETA: 18s - loss: 0.9774 - regression_loss: 0.8714 - classification_loss: 0.1060 427/500 [========================>.....] - ETA: 18s - loss: 0.9777 - regression_loss: 0.8717 - classification_loss: 0.1060 428/500 [========================>.....] - ETA: 18s - loss: 0.9772 - regression_loss: 0.8712 - classification_loss: 0.1059 429/500 [========================>.....] - ETA: 17s - loss: 0.9761 - regression_loss: 0.8704 - classification_loss: 0.1057 430/500 [========================>.....] 
- ETA: 17s - loss: 0.9770 - regression_loss: 0.8710 - classification_loss: 0.1060 431/500 [========================>.....] - ETA: 17s - loss: 0.9763 - regression_loss: 0.8705 - classification_loss: 0.1058 432/500 [========================>.....] - ETA: 17s - loss: 0.9773 - regression_loss: 0.8714 - classification_loss: 0.1058 433/500 [========================>.....] - ETA: 16s - loss: 0.9763 - regression_loss: 0.8706 - classification_loss: 0.1057 434/500 [=========================>....] - ETA: 16s - loss: 0.9762 - regression_loss: 0.8706 - classification_loss: 0.1056 435/500 [=========================>....] - ETA: 16s - loss: 0.9757 - regression_loss: 0.8703 - classification_loss: 0.1055 436/500 [=========================>....] - ETA: 16s - loss: 0.9758 - regression_loss: 0.8703 - classification_loss: 0.1055 437/500 [=========================>....] - ETA: 15s - loss: 0.9770 - regression_loss: 0.8713 - classification_loss: 0.1057 438/500 [=========================>....] - ETA: 15s - loss: 0.9760 - regression_loss: 0.8705 - classification_loss: 0.1055 439/500 [=========================>....] - ETA: 15s - loss: 0.9750 - regression_loss: 0.8696 - classification_loss: 0.1054 440/500 [=========================>....] - ETA: 15s - loss: 0.9748 - regression_loss: 0.8695 - classification_loss: 0.1053 441/500 [=========================>....] - ETA: 14s - loss: 0.9747 - regression_loss: 0.8695 - classification_loss: 0.1052 442/500 [=========================>....] - ETA: 14s - loss: 0.9753 - regression_loss: 0.8700 - classification_loss: 0.1053 443/500 [=========================>....] - ETA: 14s - loss: 0.9749 - regression_loss: 0.8697 - classification_loss: 0.1053 444/500 [=========================>....] - ETA: 14s - loss: 0.9751 - regression_loss: 0.8700 - classification_loss: 0.1051 445/500 [=========================>....] - ETA: 13s - loss: 0.9750 - regression_loss: 0.8699 - classification_loss: 0.1051 446/500 [=========================>....] 
- ETA: 13s - loss: 0.9741 - regression_loss: 0.8691 - classification_loss: 0.1050 447/500 [=========================>....] - ETA: 13s - loss: 0.9741 - regression_loss: 0.8691 - classification_loss: 0.1049 448/500 [=========================>....] - ETA: 13s - loss: 0.9738 - regression_loss: 0.8689 - classification_loss: 0.1049 449/500 [=========================>....] - ETA: 12s - loss: 0.9740 - regression_loss: 0.8690 - classification_loss: 0.1049 450/500 [==========================>...] - ETA: 12s - loss: 0.9744 - regression_loss: 0.8694 - classification_loss: 0.1050 451/500 [==========================>...] - ETA: 12s - loss: 0.9749 - regression_loss: 0.8698 - classification_loss: 0.1051 452/500 [==========================>...] - ETA: 12s - loss: 0.9742 - regression_loss: 0.8692 - classification_loss: 0.1049 453/500 [==========================>...] - ETA: 11s - loss: 0.9751 - regression_loss: 0.8701 - classification_loss: 0.1051 454/500 [==========================>...] - ETA: 11s - loss: 0.9748 - regression_loss: 0.8698 - classification_loss: 0.1050 455/500 [==========================>...] - ETA: 11s - loss: 0.9749 - regression_loss: 0.8701 - classification_loss: 0.1049 456/500 [==========================>...] - ETA: 11s - loss: 0.9753 - regression_loss: 0.8704 - classification_loss: 0.1049 457/500 [==========================>...] - ETA: 10s - loss: 0.9749 - regression_loss: 0.8700 - classification_loss: 0.1049 458/500 [==========================>...] - ETA: 10s - loss: 0.9751 - regression_loss: 0.8702 - classification_loss: 0.1049 459/500 [==========================>...] - ETA: 10s - loss: 0.9742 - regression_loss: 0.8694 - classification_loss: 0.1048 460/500 [==========================>...] - ETA: 10s - loss: 0.9749 - regression_loss: 0.8701 - classification_loss: 0.1049 461/500 [==========================>...] - ETA: 9s - loss: 0.9755 - regression_loss: 0.8705 - classification_loss: 0.1050  462/500 [==========================>...] 
- ETA: 9s - loss: 0.9760 - regression_loss: 0.8710 - classification_loss: 0.1050 463/500 [==========================>...] - ETA: 9s - loss: 0.9766 - regression_loss: 0.8717 - classification_loss: 0.1050 464/500 [==========================>...] - ETA: 9s - loss: 0.9773 - regression_loss: 0.8722 - classification_loss: 0.1051 465/500 [==========================>...] - ETA: 8s - loss: 0.9771 - regression_loss: 0.8721 - classification_loss: 0.1051 466/500 [==========================>...] - ETA: 8s - loss: 0.9781 - regression_loss: 0.8728 - classification_loss: 0.1053 467/500 [===========================>..] - ETA: 8s - loss: 0.9776 - regression_loss: 0.8724 - classification_loss: 0.1052 468/500 [===========================>..] - ETA: 8s - loss: 0.9786 - regression_loss: 0.8732 - classification_loss: 0.1054 469/500 [===========================>..] - ETA: 7s - loss: 0.9780 - regression_loss: 0.8727 - classification_loss: 0.1053 470/500 [===========================>..] - ETA: 7s - loss: 0.9778 - regression_loss: 0.8726 - classification_loss: 0.1053 471/500 [===========================>..] - ETA: 7s - loss: 0.9769 - regression_loss: 0.8718 - classification_loss: 0.1051 472/500 [===========================>..] - ETA: 7s - loss: 0.9767 - regression_loss: 0.8716 - classification_loss: 0.1051 473/500 [===========================>..] - ETA: 6s - loss: 0.9771 - regression_loss: 0.8721 - classification_loss: 0.1050 474/500 [===========================>..] - ETA: 6s - loss: 0.9773 - regression_loss: 0.8723 - classification_loss: 0.1050 475/500 [===========================>..] - ETA: 6s - loss: 0.9776 - regression_loss: 0.8725 - classification_loss: 0.1051 476/500 [===========================>..] - ETA: 6s - loss: 0.9781 - regression_loss: 0.8731 - classification_loss: 0.1050 477/500 [===========================>..] - ETA: 5s - loss: 0.9774 - regression_loss: 0.8725 - classification_loss: 0.1049 478/500 [===========================>..] 
- ETA: 5s - loss: 0.9765 - regression_loss: 0.8717 - classification_loss: 0.1047 479/500 [===========================>..] - ETA: 5s - loss: 0.9750 - regression_loss: 0.8705 - classification_loss: 0.1045 480/500 [===========================>..] - ETA: 5s - loss: 0.9751 - regression_loss: 0.8705 - classification_loss: 0.1046 481/500 [===========================>..] - ETA: 4s - loss: 0.9763 - regression_loss: 0.8715 - classification_loss: 0.1048 482/500 [===========================>..] - ETA: 4s - loss: 0.9764 - regression_loss: 0.8716 - classification_loss: 0.1048 483/500 [===========================>..] - ETA: 4s - loss: 0.9764 - regression_loss: 0.8716 - classification_loss: 0.1048 484/500 [============================>.] - ETA: 4s - loss: 0.9764 - regression_loss: 0.8717 - classification_loss: 0.1047 485/500 [============================>.] - ETA: 3s - loss: 0.9763 - regression_loss: 0.8717 - classification_loss: 0.1047 486/500 [============================>.] - ETA: 3s - loss: 0.9767 - regression_loss: 0.8720 - classification_loss: 0.1047 487/500 [============================>.] - ETA: 3s - loss: 0.9775 - regression_loss: 0.8727 - classification_loss: 0.1049 488/500 [============================>.] - ETA: 3s - loss: 0.9773 - regression_loss: 0.8725 - classification_loss: 0.1048 489/500 [============================>.] - ETA: 2s - loss: 0.9770 - regression_loss: 0.8722 - classification_loss: 0.1048 490/500 [============================>.] - ETA: 2s - loss: 0.9765 - regression_loss: 0.8718 - classification_loss: 0.1047 491/500 [============================>.] - ETA: 2s - loss: 0.9763 - regression_loss: 0.8716 - classification_loss: 0.1047 492/500 [============================>.] - ETA: 2s - loss: 0.9756 - regression_loss: 0.8710 - classification_loss: 0.1047 493/500 [============================>.] - ETA: 1s - loss: 0.9755 - regression_loss: 0.8709 - classification_loss: 0.1047 494/500 [============================>.] 
[per-step progress output for steps 495-499 of epoch 41 elided]
500/500 [==============================] - 125s 251ms/step - loss: 0.9765 - regression_loss: 0.8712 - classification_loss: 0.1053
1172 instances of class plum with average precision: 0.7745
mAP: 0.7745
Epoch 00041: saving model to ./training/snapshots/resnet50_pascal_41.h5
Epoch 42/150
[per-step progress output for steps 1-8 of epoch 42 elided; early running loss fluctuated between ~0.65 and ~0.90 while the averages accumulated]
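A note on reading these lines: the progress-bar "loss" is the sum of the two RetinaNet loss heads, each shown as a running average over the steps seen so far in the epoch. A minimal sketch, assuming keras-retinanet's convention that the reported total is regression plus classification, using the final epoch-41 values from the log:

```python
# Final epoch-41 values as printed in the log above.
regression_loss = 0.8712      # smooth-L1 box regression head
classification_loss = 0.1053  # focal-loss classification head

# Assumption: the logged "loss" is the simple sum of the two heads.
total = regression_loss + classification_loss
print(round(total, 4))  # 0.9765, matching the logged "loss"
```

This is why the total moves almost entirely with the regression term here: it is roughly eight times larger than the classification term throughout the epoch.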
[per-step progress output for steps 9-121 of epoch 42 elided; running loss rose from ~0.78 toward ~0.95 as the epoch averages stabilized]
- ETA: 1:34 - loss: 0.9516 - regression_loss: 0.8459 - classification_loss: 0.1057 122/500 [======>.......................] - ETA: 1:34 - loss: 0.9532 - regression_loss: 0.8475 - classification_loss: 0.1057 123/500 [======>.......................] - ETA: 1:34 - loss: 0.9559 - regression_loss: 0.8496 - classification_loss: 0.1063 124/500 [======>.......................] - ETA: 1:33 - loss: 0.9536 - regression_loss: 0.8478 - classification_loss: 0.1058 125/500 [======>.......................] - ETA: 1:33 - loss: 0.9538 - regression_loss: 0.8481 - classification_loss: 0.1057 126/500 [======>.......................] - ETA: 1:33 - loss: 0.9562 - regression_loss: 0.8502 - classification_loss: 0.1060 127/500 [======>.......................] - ETA: 1:33 - loss: 0.9572 - regression_loss: 0.8511 - classification_loss: 0.1061 128/500 [======>.......................] - ETA: 1:32 - loss: 0.9575 - regression_loss: 0.8514 - classification_loss: 0.1061 129/500 [======>.......................] - ETA: 1:32 - loss: 0.9576 - regression_loss: 0.8517 - classification_loss: 0.1059 130/500 [======>.......................] - ETA: 1:32 - loss: 0.9590 - regression_loss: 0.8529 - classification_loss: 0.1061 131/500 [======>.......................] - ETA: 1:32 - loss: 0.9553 - regression_loss: 0.8497 - classification_loss: 0.1055 132/500 [======>.......................] - ETA: 1:31 - loss: 0.9588 - regression_loss: 0.8531 - classification_loss: 0.1057 133/500 [======>.......................] - ETA: 1:31 - loss: 0.9560 - regression_loss: 0.8507 - classification_loss: 0.1053 134/500 [=======>......................] - ETA: 1:31 - loss: 0.9515 - regression_loss: 0.8466 - classification_loss: 0.1049 135/500 [=======>......................] - ETA: 1:31 - loss: 0.9526 - regression_loss: 0.8476 - classification_loss: 0.1049 136/500 [=======>......................] - ETA: 1:30 - loss: 0.9498 - regression_loss: 0.8454 - classification_loss: 0.1044 137/500 [=======>......................] 
- ETA: 1:30 - loss: 0.9465 - regression_loss: 0.8426 - classification_loss: 0.1039 138/500 [=======>......................] - ETA: 1:30 - loss: 0.9491 - regression_loss: 0.8451 - classification_loss: 0.1040 139/500 [=======>......................] - ETA: 1:30 - loss: 0.9503 - regression_loss: 0.8461 - classification_loss: 0.1042 140/500 [=======>......................] - ETA: 1:29 - loss: 0.9468 - regression_loss: 0.8431 - classification_loss: 0.1037 141/500 [=======>......................] - ETA: 1:29 - loss: 0.9477 - regression_loss: 0.8442 - classification_loss: 0.1035 142/500 [=======>......................] - ETA: 1:29 - loss: 0.9479 - regression_loss: 0.8445 - classification_loss: 0.1035 143/500 [=======>......................] - ETA: 1:29 - loss: 0.9460 - regression_loss: 0.8430 - classification_loss: 0.1030 144/500 [=======>......................] - ETA: 1:28 - loss: 0.9481 - regression_loss: 0.8448 - classification_loss: 0.1033 145/500 [=======>......................] - ETA: 1:28 - loss: 0.9481 - regression_loss: 0.8448 - classification_loss: 0.1033 146/500 [=======>......................] - ETA: 1:28 - loss: 0.9512 - regression_loss: 0.8474 - classification_loss: 0.1038 147/500 [=======>......................] - ETA: 1:28 - loss: 0.9520 - regression_loss: 0.8473 - classification_loss: 0.1047 148/500 [=======>......................] - ETA: 1:27 - loss: 0.9526 - regression_loss: 0.8478 - classification_loss: 0.1048 149/500 [=======>......................] - ETA: 1:27 - loss: 0.9513 - regression_loss: 0.8466 - classification_loss: 0.1046 150/500 [========>.....................] - ETA: 1:27 - loss: 0.9485 - regression_loss: 0.8444 - classification_loss: 0.1041 151/500 [========>.....................] - ETA: 1:27 - loss: 0.9485 - regression_loss: 0.8442 - classification_loss: 0.1042 152/500 [========>.....................] - ETA: 1:26 - loss: 0.9471 - regression_loss: 0.8435 - classification_loss: 0.1037 153/500 [========>.....................] 
- ETA: 1:26 - loss: 0.9469 - regression_loss: 0.8433 - classification_loss: 0.1036 154/500 [========>.....................] - ETA: 1:26 - loss: 0.9442 - regression_loss: 0.8408 - classification_loss: 0.1034 155/500 [========>.....................] - ETA: 1:26 - loss: 0.9455 - regression_loss: 0.8421 - classification_loss: 0.1035 156/500 [========>.....................] - ETA: 1:25 - loss: 0.9472 - regression_loss: 0.8434 - classification_loss: 0.1038 157/500 [========>.....................] - ETA: 1:25 - loss: 0.9443 - regression_loss: 0.8408 - classification_loss: 0.1034 158/500 [========>.....................] - ETA: 1:25 - loss: 0.9451 - regression_loss: 0.8416 - classification_loss: 0.1035 159/500 [========>.....................] - ETA: 1:25 - loss: 0.9466 - regression_loss: 0.8428 - classification_loss: 0.1038 160/500 [========>.....................] - ETA: 1:25 - loss: 0.9433 - regression_loss: 0.8401 - classification_loss: 0.1032 161/500 [========>.....................] - ETA: 1:24 - loss: 0.9411 - regression_loss: 0.8383 - classification_loss: 0.1027 162/500 [========>.....................] - ETA: 1:24 - loss: 0.9440 - regression_loss: 0.8410 - classification_loss: 0.1030 163/500 [========>.....................] - ETA: 1:24 - loss: 0.9417 - regression_loss: 0.8391 - classification_loss: 0.1026 164/500 [========>.....................] - ETA: 1:24 - loss: 0.9382 - regression_loss: 0.8361 - classification_loss: 0.1021 165/500 [========>.....................] - ETA: 1:23 - loss: 0.9400 - regression_loss: 0.8374 - classification_loss: 0.1027 166/500 [========>.....................] - ETA: 1:23 - loss: 0.9409 - regression_loss: 0.8384 - classification_loss: 0.1024 167/500 [=========>....................] - ETA: 1:23 - loss: 0.9423 - regression_loss: 0.8395 - classification_loss: 0.1028 168/500 [=========>....................] - ETA: 1:23 - loss: 0.9458 - regression_loss: 0.8431 - classification_loss: 0.1027 169/500 [=========>....................] 
- ETA: 1:22 - loss: 0.9464 - regression_loss: 0.8436 - classification_loss: 0.1029 170/500 [=========>....................] - ETA: 1:22 - loss: 0.9505 - regression_loss: 0.8477 - classification_loss: 0.1029 171/500 [=========>....................] - ETA: 1:22 - loss: 0.9502 - regression_loss: 0.8474 - classification_loss: 0.1028 172/500 [=========>....................] - ETA: 1:22 - loss: 0.9488 - regression_loss: 0.8465 - classification_loss: 0.1023 173/500 [=========>....................] - ETA: 1:21 - loss: 0.9502 - regression_loss: 0.8476 - classification_loss: 0.1026 174/500 [=========>....................] - ETA: 1:21 - loss: 0.9517 - regression_loss: 0.8488 - classification_loss: 0.1029 175/500 [=========>....................] - ETA: 1:21 - loss: 0.9524 - regression_loss: 0.8495 - classification_loss: 0.1028 176/500 [=========>....................] - ETA: 1:21 - loss: 0.9521 - regression_loss: 0.8495 - classification_loss: 0.1026 177/500 [=========>....................] - ETA: 1:20 - loss: 0.9516 - regression_loss: 0.8492 - classification_loss: 0.1024 178/500 [=========>....................] - ETA: 1:20 - loss: 0.9528 - regression_loss: 0.8502 - classification_loss: 0.1026 179/500 [=========>....................] - ETA: 1:20 - loss: 0.9523 - regression_loss: 0.8496 - classification_loss: 0.1027 180/500 [=========>....................] - ETA: 1:20 - loss: 0.9524 - regression_loss: 0.8496 - classification_loss: 0.1029 181/500 [=========>....................] - ETA: 1:19 - loss: 0.9497 - regression_loss: 0.8472 - classification_loss: 0.1024 182/500 [=========>....................] - ETA: 1:19 - loss: 0.9489 - regression_loss: 0.8466 - classification_loss: 0.1023 183/500 [=========>....................] - ETA: 1:19 - loss: 0.9485 - regression_loss: 0.8461 - classification_loss: 0.1024 184/500 [==========>...................] - ETA: 1:19 - loss: 0.9506 - regression_loss: 0.8480 - classification_loss: 0.1026 185/500 [==========>...................] 
- ETA: 1:18 - loss: 0.9496 - regression_loss: 0.8473 - classification_loss: 0.1023 186/500 [==========>...................] - ETA: 1:18 - loss: 0.9534 - regression_loss: 0.8508 - classification_loss: 0.1026 187/500 [==========>...................] - ETA: 1:18 - loss: 0.9548 - regression_loss: 0.8520 - classification_loss: 0.1028 188/500 [==========>...................] - ETA: 1:18 - loss: 0.9551 - regression_loss: 0.8525 - classification_loss: 0.1026 189/500 [==========>...................] - ETA: 1:17 - loss: 0.9528 - regression_loss: 0.8506 - classification_loss: 0.1023 190/500 [==========>...................] - ETA: 1:17 - loss: 0.9510 - regression_loss: 0.8490 - classification_loss: 0.1019 191/500 [==========>...................] - ETA: 1:17 - loss: 0.9512 - regression_loss: 0.8494 - classification_loss: 0.1018 192/500 [==========>...................] - ETA: 1:17 - loss: 0.9480 - regression_loss: 0.8465 - classification_loss: 0.1016 193/500 [==========>...................] - ETA: 1:16 - loss: 0.9498 - regression_loss: 0.8480 - classification_loss: 0.1018 194/500 [==========>...................] - ETA: 1:16 - loss: 0.9498 - regression_loss: 0.8478 - classification_loss: 0.1020 195/500 [==========>...................] - ETA: 1:16 - loss: 0.9505 - regression_loss: 0.8484 - classification_loss: 0.1022 196/500 [==========>...................] - ETA: 1:16 - loss: 0.9520 - regression_loss: 0.8497 - classification_loss: 0.1023 197/500 [==========>...................] - ETA: 1:15 - loss: 0.9508 - regression_loss: 0.8485 - classification_loss: 0.1023 198/500 [==========>...................] - ETA: 1:15 - loss: 0.9494 - regression_loss: 0.8474 - classification_loss: 0.1020 199/500 [==========>...................] - ETA: 1:15 - loss: 0.9496 - regression_loss: 0.8478 - classification_loss: 0.1018 200/500 [===========>..................] - ETA: 1:15 - loss: 0.9506 - regression_loss: 0.8487 - classification_loss: 0.1019 201/500 [===========>..................] 
- ETA: 1:14 - loss: 0.9524 - regression_loss: 0.8499 - classification_loss: 0.1025 202/500 [===========>..................] - ETA: 1:14 - loss: 0.9554 - regression_loss: 0.8525 - classification_loss: 0.1029 203/500 [===========>..................] - ETA: 1:14 - loss: 0.9577 - regression_loss: 0.8543 - classification_loss: 0.1034 204/500 [===========>..................] - ETA: 1:14 - loss: 0.9585 - regression_loss: 0.8550 - classification_loss: 0.1036 205/500 [===========>..................] - ETA: 1:13 - loss: 0.9587 - regression_loss: 0.8551 - classification_loss: 0.1035 206/500 [===========>..................] - ETA: 1:13 - loss: 0.9616 - regression_loss: 0.8580 - classification_loss: 0.1037 207/500 [===========>..................] - ETA: 1:13 - loss: 0.9638 - regression_loss: 0.8597 - classification_loss: 0.1040 208/500 [===========>..................] - ETA: 1:13 - loss: 0.9625 - regression_loss: 0.8588 - classification_loss: 0.1038 209/500 [===========>..................] - ETA: 1:12 - loss: 0.9636 - regression_loss: 0.8596 - classification_loss: 0.1040 210/500 [===========>..................] - ETA: 1:12 - loss: 0.9638 - regression_loss: 0.8599 - classification_loss: 0.1039 211/500 [===========>..................] - ETA: 1:12 - loss: 0.9621 - regression_loss: 0.8585 - classification_loss: 0.1036 212/500 [===========>..................] - ETA: 1:12 - loss: 0.9627 - regression_loss: 0.8590 - classification_loss: 0.1036 213/500 [===========>..................] - ETA: 1:11 - loss: 0.9650 - regression_loss: 0.8609 - classification_loss: 0.1041 214/500 [===========>..................] - ETA: 1:11 - loss: 0.9611 - regression_loss: 0.8575 - classification_loss: 0.1036 215/500 [===========>..................] - ETA: 1:11 - loss: 0.9582 - regression_loss: 0.8549 - classification_loss: 0.1033 216/500 [===========>..................] - ETA: 1:11 - loss: 0.9592 - regression_loss: 0.8557 - classification_loss: 0.1036 217/500 [============>.................] 
- ETA: 1:10 - loss: 0.9620 - regression_loss: 0.8580 - classification_loss: 0.1040 218/500 [============>.................] - ETA: 1:10 - loss: 0.9611 - regression_loss: 0.8575 - classification_loss: 0.1036 219/500 [============>.................] - ETA: 1:10 - loss: 0.9603 - regression_loss: 0.8569 - classification_loss: 0.1034 220/500 [============>.................] - ETA: 1:10 - loss: 0.9604 - regression_loss: 0.8570 - classification_loss: 0.1034 221/500 [============>.................] - ETA: 1:09 - loss: 0.9632 - regression_loss: 0.8593 - classification_loss: 0.1039 222/500 [============>.................] - ETA: 1:09 - loss: 0.9631 - regression_loss: 0.8593 - classification_loss: 0.1038 223/500 [============>.................] - ETA: 1:09 - loss: 0.9623 - regression_loss: 0.8586 - classification_loss: 0.1037 224/500 [============>.................] - ETA: 1:09 - loss: 0.9619 - regression_loss: 0.8582 - classification_loss: 0.1037 225/500 [============>.................] - ETA: 1:08 - loss: 0.9621 - regression_loss: 0.8583 - classification_loss: 0.1038 226/500 [============>.................] - ETA: 1:08 - loss: 0.9631 - regression_loss: 0.8592 - classification_loss: 0.1039 227/500 [============>.................] - ETA: 1:08 - loss: 0.9642 - regression_loss: 0.8601 - classification_loss: 0.1041 228/500 [============>.................] - ETA: 1:08 - loss: 0.9628 - regression_loss: 0.8590 - classification_loss: 0.1038 229/500 [============>.................] - ETA: 1:07 - loss: 0.9609 - regression_loss: 0.8574 - classification_loss: 0.1035 230/500 [============>.................] - ETA: 1:07 - loss: 0.9572 - regression_loss: 0.8540 - classification_loss: 0.1032 231/500 [============>.................] - ETA: 1:07 - loss: 0.9580 - regression_loss: 0.8547 - classification_loss: 0.1033 232/500 [============>.................] - ETA: 1:07 - loss: 0.9555 - regression_loss: 0.8525 - classification_loss: 0.1030 233/500 [============>.................] 
- ETA: 1:06 - loss: 0.9552 - regression_loss: 0.8522 - classification_loss: 0.1030 234/500 [=============>................] - ETA: 1:06 - loss: 0.9582 - regression_loss: 0.8550 - classification_loss: 0.1032 235/500 [=============>................] - ETA: 1:06 - loss: 0.9598 - regression_loss: 0.8560 - classification_loss: 0.1037 236/500 [=============>................] - ETA: 1:06 - loss: 0.9606 - regression_loss: 0.8568 - classification_loss: 0.1038 237/500 [=============>................] - ETA: 1:05 - loss: 0.9610 - regression_loss: 0.8571 - classification_loss: 0.1039 238/500 [=============>................] - ETA: 1:05 - loss: 0.9624 - regression_loss: 0.8581 - classification_loss: 0.1042 239/500 [=============>................] - ETA: 1:05 - loss: 0.9618 - regression_loss: 0.8578 - classification_loss: 0.1040 240/500 [=============>................] - ETA: 1:05 - loss: 0.9612 - regression_loss: 0.8574 - classification_loss: 0.1039 241/500 [=============>................] - ETA: 1:04 - loss: 0.9606 - regression_loss: 0.8568 - classification_loss: 0.1038 242/500 [=============>................] - ETA: 1:04 - loss: 0.9600 - regression_loss: 0.8562 - classification_loss: 0.1038 243/500 [=============>................] - ETA: 1:04 - loss: 0.9603 - regression_loss: 0.8565 - classification_loss: 0.1038 244/500 [=============>................] - ETA: 1:04 - loss: 0.9621 - regression_loss: 0.8581 - classification_loss: 0.1039 245/500 [=============>................] - ETA: 1:03 - loss: 0.9618 - regression_loss: 0.8579 - classification_loss: 0.1039 246/500 [=============>................] - ETA: 1:03 - loss: 0.9605 - regression_loss: 0.8568 - classification_loss: 0.1037 247/500 [=============>................] - ETA: 1:03 - loss: 0.9619 - regression_loss: 0.8579 - classification_loss: 0.1040 248/500 [=============>................] - ETA: 1:03 - loss: 0.9636 - regression_loss: 0.8589 - classification_loss: 0.1047 249/500 [=============>................] 
- ETA: 1:02 - loss: 0.9642 - regression_loss: 0.8595 - classification_loss: 0.1048 250/500 [==============>...............] - ETA: 1:02 - loss: 0.9623 - regression_loss: 0.8579 - classification_loss: 0.1044 251/500 [==============>...............] - ETA: 1:02 - loss: 0.9604 - regression_loss: 0.8563 - classification_loss: 0.1042 252/500 [==============>...............] - ETA: 1:02 - loss: 0.9609 - regression_loss: 0.8567 - classification_loss: 0.1043 253/500 [==============>...............] - ETA: 1:01 - loss: 0.9599 - regression_loss: 0.8558 - classification_loss: 0.1040 254/500 [==============>...............] - ETA: 1:01 - loss: 0.9596 - regression_loss: 0.8557 - classification_loss: 0.1039 255/500 [==============>...............] - ETA: 1:01 - loss: 0.9582 - regression_loss: 0.8545 - classification_loss: 0.1037 256/500 [==============>...............] - ETA: 1:01 - loss: 0.9562 - regression_loss: 0.8528 - classification_loss: 0.1034 257/500 [==============>...............] - ETA: 1:00 - loss: 0.9559 - regression_loss: 0.8525 - classification_loss: 0.1033 258/500 [==============>...............] - ETA: 1:00 - loss: 0.9555 - regression_loss: 0.8522 - classification_loss: 0.1033 259/500 [==============>...............] - ETA: 1:00 - loss: 0.9582 - regression_loss: 0.8543 - classification_loss: 0.1039 260/500 [==============>...............] - ETA: 1:00 - loss: 0.9573 - regression_loss: 0.8536 - classification_loss: 0.1038 261/500 [==============>...............] - ETA: 59s - loss: 0.9577 - regression_loss: 0.8540 - classification_loss: 0.1038  262/500 [==============>...............] - ETA: 59s - loss: 0.9590 - regression_loss: 0.8552 - classification_loss: 0.1038 263/500 [==============>...............] - ETA: 59s - loss: 0.9598 - regression_loss: 0.8558 - classification_loss: 0.1039 264/500 [==============>...............] - ETA: 59s - loss: 0.9610 - regression_loss: 0.8572 - classification_loss: 0.1038 265/500 [==============>...............] 
- ETA: 58s - loss: 0.9609 - regression_loss: 0.8572 - classification_loss: 0.1037 266/500 [==============>...............] - ETA: 58s - loss: 0.9631 - regression_loss: 0.8590 - classification_loss: 0.1042 267/500 [===============>..............] - ETA: 58s - loss: 0.9634 - regression_loss: 0.8592 - classification_loss: 0.1042 268/500 [===============>..............] - ETA: 58s - loss: 0.9640 - regression_loss: 0.8600 - classification_loss: 0.1040 269/500 [===============>..............] - ETA: 57s - loss: 0.9627 - regression_loss: 0.8589 - classification_loss: 0.1038 270/500 [===============>..............] - ETA: 57s - loss: 0.9623 - regression_loss: 0.8585 - classification_loss: 0.1037 271/500 [===============>..............] - ETA: 57s - loss: 0.9617 - regression_loss: 0.8580 - classification_loss: 0.1036 272/500 [===============>..............] - ETA: 57s - loss: 0.9615 - regression_loss: 0.8580 - classification_loss: 0.1035 273/500 [===============>..............] - ETA: 56s - loss: 0.9626 - regression_loss: 0.8589 - classification_loss: 0.1037 274/500 [===============>..............] - ETA: 56s - loss: 0.9624 - regression_loss: 0.8589 - classification_loss: 0.1035 275/500 [===============>..............] - ETA: 56s - loss: 0.9630 - regression_loss: 0.8593 - classification_loss: 0.1037 276/500 [===============>..............] - ETA: 56s - loss: 0.9632 - regression_loss: 0.8596 - classification_loss: 0.1036 277/500 [===============>..............] - ETA: 55s - loss: 0.9620 - regression_loss: 0.8585 - classification_loss: 0.1035 278/500 [===============>..............] - ETA: 55s - loss: 0.9619 - regression_loss: 0.8585 - classification_loss: 0.1034 279/500 [===============>..............] - ETA: 55s - loss: 0.9621 - regression_loss: 0.8588 - classification_loss: 0.1033 280/500 [===============>..............] - ETA: 55s - loss: 0.9612 - regression_loss: 0.8580 - classification_loss: 0.1032 281/500 [===============>..............] 
- ETA: 54s - loss: 0.9595 - regression_loss: 0.8565 - classification_loss: 0.1030 282/500 [===============>..............] - ETA: 54s - loss: 0.9599 - regression_loss: 0.8569 - classification_loss: 0.1029 283/500 [===============>..............] - ETA: 54s - loss: 0.9605 - regression_loss: 0.8573 - classification_loss: 0.1032 284/500 [================>.............] - ETA: 54s - loss: 0.9598 - regression_loss: 0.8566 - classification_loss: 0.1032 285/500 [================>.............] - ETA: 53s - loss: 0.9576 - regression_loss: 0.8546 - classification_loss: 0.1029 286/500 [================>.............] - ETA: 53s - loss: 0.9574 - regression_loss: 0.8545 - classification_loss: 0.1029 287/500 [================>.............] - ETA: 53s - loss: 0.9583 - regression_loss: 0.8553 - classification_loss: 0.1030 288/500 [================>.............] - ETA: 53s - loss: 0.9580 - regression_loss: 0.8551 - classification_loss: 0.1029 289/500 [================>.............] - ETA: 52s - loss: 0.9570 - regression_loss: 0.8544 - classification_loss: 0.1027 290/500 [================>.............] - ETA: 52s - loss: 0.9596 - regression_loss: 0.8563 - classification_loss: 0.1032 291/500 [================>.............] - ETA: 52s - loss: 0.9597 - regression_loss: 0.8564 - classification_loss: 0.1033 292/500 [================>.............] - ETA: 52s - loss: 0.9586 - regression_loss: 0.8554 - classification_loss: 0.1032 293/500 [================>.............] - ETA: 51s - loss: 0.9565 - regression_loss: 0.8536 - classification_loss: 0.1029 294/500 [================>.............] - ETA: 51s - loss: 0.9562 - regression_loss: 0.8534 - classification_loss: 0.1028 295/500 [================>.............] - ETA: 51s - loss: 0.9564 - regression_loss: 0.8535 - classification_loss: 0.1029 296/500 [================>.............] - ETA: 51s - loss: 0.9572 - regression_loss: 0.8542 - classification_loss: 0.1030 297/500 [================>.............] 
- ETA: 50s - loss: 0.9572 - regression_loss: 0.8542 - classification_loss: 0.1030 298/500 [================>.............] - ETA: 50s - loss: 0.9574 - regression_loss: 0.8543 - classification_loss: 0.1031 299/500 [================>.............] - ETA: 50s - loss: 0.9590 - regression_loss: 0.8556 - classification_loss: 0.1034 300/500 [=================>............] - ETA: 50s - loss: 0.9592 - regression_loss: 0.8558 - classification_loss: 0.1034 301/500 [=================>............] - ETA: 49s - loss: 0.9591 - regression_loss: 0.8556 - classification_loss: 0.1035 302/500 [=================>............] - ETA: 49s - loss: 0.9587 - regression_loss: 0.8554 - classification_loss: 0.1034 303/500 [=================>............] - ETA: 49s - loss: 0.9606 - regression_loss: 0.8568 - classification_loss: 0.1037 304/500 [=================>............] - ETA: 49s - loss: 0.9620 - regression_loss: 0.8581 - classification_loss: 0.1040 305/500 [=================>............] - ETA: 48s - loss: 0.9620 - regression_loss: 0.8582 - classification_loss: 0.1038 306/500 [=================>............] - ETA: 48s - loss: 0.9633 - regression_loss: 0.8593 - classification_loss: 0.1040 307/500 [=================>............] - ETA: 48s - loss: 0.9646 - regression_loss: 0.8604 - classification_loss: 0.1042 308/500 [=================>............] - ETA: 48s - loss: 0.9668 - regression_loss: 0.8623 - classification_loss: 0.1044 309/500 [=================>............] - ETA: 47s - loss: 0.9657 - regression_loss: 0.8615 - classification_loss: 0.1042 310/500 [=================>............] - ETA: 47s - loss: 0.9651 - regression_loss: 0.8610 - classification_loss: 0.1041 311/500 [=================>............] - ETA: 47s - loss: 0.9635 - regression_loss: 0.8596 - classification_loss: 0.1039 312/500 [=================>............] - ETA: 47s - loss: 0.9649 - regression_loss: 0.8605 - classification_loss: 0.1045 313/500 [=================>............] 
- ETA: 46s - loss: 0.9641 - regression_loss: 0.8598 - classification_loss: 0.1043 314/500 [=================>............] - ETA: 46s - loss: 0.9637 - regression_loss: 0.8594 - classification_loss: 0.1042 315/500 [=================>............] - ETA: 46s - loss: 0.9634 - regression_loss: 0.8592 - classification_loss: 0.1041 316/500 [=================>............] - ETA: 46s - loss: 0.9635 - regression_loss: 0.8594 - classification_loss: 0.1042 317/500 [==================>...........] - ETA: 45s - loss: 0.9631 - regression_loss: 0.8589 - classification_loss: 0.1041 318/500 [==================>...........] - ETA: 45s - loss: 0.9623 - regression_loss: 0.8584 - classification_loss: 0.1040 319/500 [==================>...........] - ETA: 45s - loss: 0.9608 - regression_loss: 0.8570 - classification_loss: 0.1037 320/500 [==================>...........] - ETA: 45s - loss: 0.9615 - regression_loss: 0.8576 - classification_loss: 0.1038 321/500 [==================>...........] - ETA: 44s - loss: 0.9615 - regression_loss: 0.8576 - classification_loss: 0.1039 322/500 [==================>...........] - ETA: 44s - loss: 0.9629 - regression_loss: 0.8588 - classification_loss: 0.1041 323/500 [==================>...........] - ETA: 44s - loss: 0.9632 - regression_loss: 0.8591 - classification_loss: 0.1041 324/500 [==================>...........] - ETA: 44s - loss: 0.9630 - regression_loss: 0.8589 - classification_loss: 0.1041 325/500 [==================>...........] - ETA: 43s - loss: 0.9615 - regression_loss: 0.8575 - classification_loss: 0.1040 326/500 [==================>...........] - ETA: 43s - loss: 0.9619 - regression_loss: 0.8578 - classification_loss: 0.1041 327/500 [==================>...........] - ETA: 43s - loss: 0.9612 - regression_loss: 0.8574 - classification_loss: 0.1038 328/500 [==================>...........] - ETA: 43s - loss: 0.9611 - regression_loss: 0.8572 - classification_loss: 0.1039 329/500 [==================>...........] 
- ETA: 42s - loss: 0.9612 - regression_loss: 0.8573 - classification_loss: 0.1039
[Epoch 42/150: per-batch progress for batches 330-488 of 500 omitted; loss fluctuates between ~0.944 and ~0.965 (regression_loss ~0.841-0.861, classification_loss ~0.103-0.106), ETA counting down from 42s]
489/500 [============================>.]
- ETA: 2s - loss: 0.9448 - regression_loss: 0.8420 - classification_loss: 0.1028
[Epoch 42/150: per-batch progress for batches 490-499 of 500 omitted]
500/500 [==============================] - 125s 250ms/step - loss: 0.9475 - regression_loss: 0.8444 - classification_loss: 0.1031
1172 instances of class plum with average precision: 0.7790
mAP: 0.7790
Epoch 00042: saving model to ./training/snapshots/resnet50_pascal_42.h5
Epoch 43/150
[Epoch 43/150: per-batch progress for batches 1-3 of 500 omitted; loss starts at ~1.45 and falls to ~1.33]
4/500 [..............................]
- ETA: 2:04 - loss: 1.2695 - regression_loss: 1.1097 - classification_loss: 0.1598
[Epoch 43/150: per-batch progress for batches 5-163 of 500 omitted; loss falls from ~1.23 at batch 5 to ~0.95 by batch 163 (regression_loss ~1.08→0.847, classification_loss ~0.152→0.102), ETA counting down from 2:04 to 1:24]
164/500 [========>.....................]
- ETA: 1:24 - loss: 0.9510 - regression_loss: 0.8485 - classification_loss: 0.1024 165/500 [========>.....................] - ETA: 1:23 - loss: 0.9524 - regression_loss: 0.8496 - classification_loss: 0.1028 166/500 [========>.....................] - ETA: 1:23 - loss: 0.9544 - regression_loss: 0.8511 - classification_loss: 0.1033 167/500 [=========>....................] - ETA: 1:23 - loss: 0.9557 - regression_loss: 0.8522 - classification_loss: 0.1034 168/500 [=========>....................] - ETA: 1:23 - loss: 0.9558 - regression_loss: 0.8524 - classification_loss: 0.1034 169/500 [=========>....................] - ETA: 1:22 - loss: 0.9524 - regression_loss: 0.8494 - classification_loss: 0.1030 170/500 [=========>....................] - ETA: 1:22 - loss: 0.9525 - regression_loss: 0.8495 - classification_loss: 0.1030 171/500 [=========>....................] - ETA: 1:22 - loss: 0.9511 - regression_loss: 0.8485 - classification_loss: 0.1026 172/500 [=========>....................] - ETA: 1:22 - loss: 0.9525 - regression_loss: 0.8497 - classification_loss: 0.1028 173/500 [=========>....................] - ETA: 1:21 - loss: 0.9545 - regression_loss: 0.8513 - classification_loss: 0.1032 174/500 [=========>....................] - ETA: 1:21 - loss: 0.9555 - regression_loss: 0.8521 - classification_loss: 0.1034 175/500 [=========>....................] - ETA: 1:21 - loss: 0.9528 - regression_loss: 0.8497 - classification_loss: 0.1031 176/500 [=========>....................] - ETA: 1:21 - loss: 0.9507 - regression_loss: 0.8480 - classification_loss: 0.1027 177/500 [=========>....................] - ETA: 1:20 - loss: 0.9510 - regression_loss: 0.8484 - classification_loss: 0.1026 178/500 [=========>....................] - ETA: 1:20 - loss: 0.9485 - regression_loss: 0.8462 - classification_loss: 0.1023 179/500 [=========>....................] - ETA: 1:20 - loss: 0.9475 - regression_loss: 0.8455 - classification_loss: 0.1019 180/500 [=========>....................] 
- ETA: 1:20 - loss: 0.9502 - regression_loss: 0.8479 - classification_loss: 0.1023 181/500 [=========>....................] - ETA: 1:19 - loss: 0.9482 - regression_loss: 0.8463 - classification_loss: 0.1019 182/500 [=========>....................] - ETA: 1:19 - loss: 0.9458 - regression_loss: 0.8442 - classification_loss: 0.1016 183/500 [=========>....................] - ETA: 1:19 - loss: 0.9420 - regression_loss: 0.8408 - classification_loss: 0.1012 184/500 [==========>...................] - ETA: 1:19 - loss: 0.9412 - regression_loss: 0.8402 - classification_loss: 0.1009 185/500 [==========>...................] - ETA: 1:18 - loss: 0.9432 - regression_loss: 0.8420 - classification_loss: 0.1012 186/500 [==========>...................] - ETA: 1:18 - loss: 0.9440 - regression_loss: 0.8427 - classification_loss: 0.1013 187/500 [==========>...................] - ETA: 1:18 - loss: 0.9413 - regression_loss: 0.8404 - classification_loss: 0.1008 188/500 [==========>...................] - ETA: 1:18 - loss: 0.9401 - regression_loss: 0.8394 - classification_loss: 0.1007 189/500 [==========>...................] - ETA: 1:17 - loss: 0.9419 - regression_loss: 0.8410 - classification_loss: 0.1010 190/500 [==========>...................] - ETA: 1:17 - loss: 0.9427 - regression_loss: 0.8417 - classification_loss: 0.1010 191/500 [==========>...................] - ETA: 1:17 - loss: 0.9427 - regression_loss: 0.8417 - classification_loss: 0.1010 192/500 [==========>...................] - ETA: 1:17 - loss: 0.9439 - regression_loss: 0.8428 - classification_loss: 0.1012 193/500 [==========>...................] - ETA: 1:16 - loss: 0.9447 - regression_loss: 0.8434 - classification_loss: 0.1012 194/500 [==========>...................] - ETA: 1:16 - loss: 0.9441 - regression_loss: 0.8430 - classification_loss: 0.1011 195/500 [==========>...................] - ETA: 1:16 - loss: 0.9429 - regression_loss: 0.8421 - classification_loss: 0.1007 196/500 [==========>...................] 
- ETA: 1:16 - loss: 0.9417 - regression_loss: 0.8411 - classification_loss: 0.1006 197/500 [==========>...................] - ETA: 1:15 - loss: 0.9421 - regression_loss: 0.8414 - classification_loss: 0.1007 198/500 [==========>...................] - ETA: 1:15 - loss: 0.9419 - regression_loss: 0.8412 - classification_loss: 0.1007 199/500 [==========>...................] - ETA: 1:15 - loss: 0.9434 - regression_loss: 0.8427 - classification_loss: 0.1007 200/500 [===========>..................] - ETA: 1:15 - loss: 0.9451 - regression_loss: 0.8442 - classification_loss: 0.1009 201/500 [===========>..................] - ETA: 1:14 - loss: 0.9480 - regression_loss: 0.8473 - classification_loss: 0.1007 202/500 [===========>..................] - ETA: 1:14 - loss: 0.9456 - regression_loss: 0.8453 - classification_loss: 0.1003 203/500 [===========>..................] - ETA: 1:14 - loss: 0.9418 - regression_loss: 0.8418 - classification_loss: 0.0999 204/500 [===========>..................] - ETA: 1:14 - loss: 0.9405 - regression_loss: 0.8408 - classification_loss: 0.0997 205/500 [===========>..................] - ETA: 1:14 - loss: 0.9397 - regression_loss: 0.8402 - classification_loss: 0.0995 206/500 [===========>..................] - ETA: 1:13 - loss: 0.9385 - regression_loss: 0.8393 - classification_loss: 0.0992 207/500 [===========>..................] - ETA: 1:13 - loss: 0.9410 - regression_loss: 0.8406 - classification_loss: 0.1004 208/500 [===========>..................] - ETA: 1:13 - loss: 0.9425 - regression_loss: 0.8416 - classification_loss: 0.1010 209/500 [===========>..................] - ETA: 1:13 - loss: 0.9444 - regression_loss: 0.8432 - classification_loss: 0.1013 210/500 [===========>..................] - ETA: 1:12 - loss: 0.9463 - regression_loss: 0.8448 - classification_loss: 0.1015 211/500 [===========>..................] - ETA: 1:12 - loss: 0.9484 - regression_loss: 0.8466 - classification_loss: 0.1017 212/500 [===========>..................] 
- ETA: 1:12 - loss: 0.9516 - regression_loss: 0.8498 - classification_loss: 0.1017 213/500 [===========>..................] - ETA: 1:12 - loss: 0.9504 - regression_loss: 0.8489 - classification_loss: 0.1015 214/500 [===========>..................] - ETA: 1:11 - loss: 0.9498 - regression_loss: 0.8485 - classification_loss: 0.1013 215/500 [===========>..................] - ETA: 1:11 - loss: 0.9507 - regression_loss: 0.8492 - classification_loss: 0.1015 216/500 [===========>..................] - ETA: 1:11 - loss: 0.9479 - regression_loss: 0.8468 - classification_loss: 0.1011 217/500 [============>.................] - ETA: 1:11 - loss: 0.9460 - regression_loss: 0.8452 - classification_loss: 0.1007 218/500 [============>.................] - ETA: 1:10 - loss: 0.9444 - regression_loss: 0.8439 - classification_loss: 0.1005 219/500 [============>.................] - ETA: 1:10 - loss: 0.9432 - regression_loss: 0.8429 - classification_loss: 0.1003 220/500 [============>.................] - ETA: 1:10 - loss: 0.9424 - regression_loss: 0.8422 - classification_loss: 0.1003 221/500 [============>.................] - ETA: 1:10 - loss: 0.9437 - regression_loss: 0.8434 - classification_loss: 0.1003 222/500 [============>.................] - ETA: 1:09 - loss: 0.9438 - regression_loss: 0.8435 - classification_loss: 0.1003 223/500 [============>.................] - ETA: 1:09 - loss: 0.9437 - regression_loss: 0.8437 - classification_loss: 0.1000 224/500 [============>.................] - ETA: 1:09 - loss: 0.9416 - regression_loss: 0.8419 - classification_loss: 0.0997 225/500 [============>.................] - ETA: 1:09 - loss: 0.9404 - regression_loss: 0.8410 - classification_loss: 0.0994 226/500 [============>.................] - ETA: 1:08 - loss: 0.9396 - regression_loss: 0.8404 - classification_loss: 0.0992 227/500 [============>.................] - ETA: 1:08 - loss: 0.9402 - regression_loss: 0.8409 - classification_loss: 0.0993 228/500 [============>.................] 
- ETA: 1:08 - loss: 0.9376 - regression_loss: 0.8386 - classification_loss: 0.0989 229/500 [============>.................] - ETA: 1:08 - loss: 0.9369 - regression_loss: 0.8382 - classification_loss: 0.0987 230/500 [============>.................] - ETA: 1:07 - loss: 0.9379 - regression_loss: 0.8390 - classification_loss: 0.0989 231/500 [============>.................] - ETA: 1:07 - loss: 0.9358 - regression_loss: 0.8372 - classification_loss: 0.0986 232/500 [============>.................] - ETA: 1:07 - loss: 0.9334 - regression_loss: 0.8350 - classification_loss: 0.0984 233/500 [============>.................] - ETA: 1:07 - loss: 0.9313 - regression_loss: 0.8332 - classification_loss: 0.0982 234/500 [=============>................] - ETA: 1:06 - loss: 0.9312 - regression_loss: 0.8327 - classification_loss: 0.0984 235/500 [=============>................] - ETA: 1:06 - loss: 0.9323 - regression_loss: 0.8336 - classification_loss: 0.0986 236/500 [=============>................] - ETA: 1:06 - loss: 0.9307 - regression_loss: 0.8323 - classification_loss: 0.0984 237/500 [=============>................] - ETA: 1:06 - loss: 0.9307 - regression_loss: 0.8325 - classification_loss: 0.0981 238/500 [=============>................] - ETA: 1:05 - loss: 0.9323 - regression_loss: 0.8338 - classification_loss: 0.0985 239/500 [=============>................] - ETA: 1:05 - loss: 0.9328 - regression_loss: 0.8343 - classification_loss: 0.0985 240/500 [=============>................] - ETA: 1:05 - loss: 0.9330 - regression_loss: 0.8347 - classification_loss: 0.0984 241/500 [=============>................] - ETA: 1:05 - loss: 0.9351 - regression_loss: 0.8366 - classification_loss: 0.0985 242/500 [=============>................] - ETA: 1:04 - loss: 0.9379 - regression_loss: 0.8390 - classification_loss: 0.0989 243/500 [=============>................] - ETA: 1:04 - loss: 0.9366 - regression_loss: 0.8380 - classification_loss: 0.0986 244/500 [=============>................] 
- ETA: 1:04 - loss: 0.9370 - regression_loss: 0.8384 - classification_loss: 0.0987 245/500 [=============>................] - ETA: 1:04 - loss: 0.9379 - regression_loss: 0.8392 - classification_loss: 0.0987 246/500 [=============>................] - ETA: 1:03 - loss: 0.9361 - regression_loss: 0.8376 - classification_loss: 0.0984 247/500 [=============>................] - ETA: 1:03 - loss: 0.9385 - regression_loss: 0.8397 - classification_loss: 0.0988 248/500 [=============>................] - ETA: 1:03 - loss: 0.9402 - regression_loss: 0.8409 - classification_loss: 0.0993 249/500 [=============>................] - ETA: 1:03 - loss: 0.9398 - regression_loss: 0.8406 - classification_loss: 0.0993 250/500 [==============>...............] - ETA: 1:02 - loss: 0.9416 - regression_loss: 0.8421 - classification_loss: 0.0995 251/500 [==============>...............] - ETA: 1:02 - loss: 0.9432 - regression_loss: 0.8435 - classification_loss: 0.0997 252/500 [==============>...............] - ETA: 1:02 - loss: 0.9441 - regression_loss: 0.8443 - classification_loss: 0.0998 253/500 [==============>...............] - ETA: 1:02 - loss: 0.9450 - regression_loss: 0.8448 - classification_loss: 0.1002 254/500 [==============>...............] - ETA: 1:01 - loss: 0.9462 - regression_loss: 0.8459 - classification_loss: 0.1003 255/500 [==============>...............] - ETA: 1:01 - loss: 0.9470 - regression_loss: 0.8467 - classification_loss: 0.1003 256/500 [==============>...............] - ETA: 1:01 - loss: 0.9476 - regression_loss: 0.8473 - classification_loss: 0.1003 257/500 [==============>...............] - ETA: 1:00 - loss: 0.9469 - regression_loss: 0.8469 - classification_loss: 0.1000 258/500 [==============>...............] - ETA: 1:00 - loss: 0.9494 - regression_loss: 0.8489 - classification_loss: 0.1005 259/500 [==============>...............] - ETA: 1:00 - loss: 0.9501 - regression_loss: 0.8498 - classification_loss: 0.1004 260/500 [==============>...............] 
- ETA: 1:00 - loss: 0.9509 - regression_loss: 0.8504 - classification_loss: 0.1005 261/500 [==============>...............] - ETA: 59s - loss: 0.9499 - regression_loss: 0.8496 - classification_loss: 0.1003  262/500 [==============>...............] - ETA: 59s - loss: 0.9507 - regression_loss: 0.8502 - classification_loss: 0.1005 263/500 [==============>...............] - ETA: 59s - loss: 0.9524 - regression_loss: 0.8516 - classification_loss: 0.1007 264/500 [==============>...............] - ETA: 59s - loss: 0.9542 - regression_loss: 0.8531 - classification_loss: 0.1011 265/500 [==============>...............] - ETA: 58s - loss: 0.9562 - regression_loss: 0.8547 - classification_loss: 0.1015 266/500 [==============>...............] - ETA: 58s - loss: 0.9579 - regression_loss: 0.8561 - classification_loss: 0.1018 267/500 [===============>..............] - ETA: 58s - loss: 0.9607 - regression_loss: 0.8582 - classification_loss: 0.1024 268/500 [===============>..............] - ETA: 58s - loss: 0.9625 - regression_loss: 0.8598 - classification_loss: 0.1028 269/500 [===============>..............] - ETA: 57s - loss: 0.9619 - regression_loss: 0.8593 - classification_loss: 0.1026 270/500 [===============>..............] - ETA: 57s - loss: 0.9615 - regression_loss: 0.8590 - classification_loss: 0.1024 271/500 [===============>..............] - ETA: 57s - loss: 0.9614 - regression_loss: 0.8588 - classification_loss: 0.1025 272/500 [===============>..............] - ETA: 57s - loss: 0.9612 - regression_loss: 0.8588 - classification_loss: 0.1024 273/500 [===============>..............] - ETA: 56s - loss: 0.9610 - regression_loss: 0.8586 - classification_loss: 0.1024 274/500 [===============>..............] - ETA: 56s - loss: 0.9590 - regression_loss: 0.8569 - classification_loss: 0.1021 275/500 [===============>..............] - ETA: 56s - loss: 0.9591 - regression_loss: 0.8570 - classification_loss: 0.1021 276/500 [===============>..............] 
- ETA: 56s - loss: 0.9582 - regression_loss: 0.8562 - classification_loss: 0.1020 277/500 [===============>..............] - ETA: 55s - loss: 0.9562 - regression_loss: 0.8545 - classification_loss: 0.1017 278/500 [===============>..............] - ETA: 55s - loss: 0.9556 - regression_loss: 0.8540 - classification_loss: 0.1016 279/500 [===============>..............] - ETA: 55s - loss: 0.9533 - regression_loss: 0.8519 - classification_loss: 0.1013 280/500 [===============>..............] - ETA: 55s - loss: 0.9526 - regression_loss: 0.8511 - classification_loss: 0.1015 281/500 [===============>..............] - ETA: 54s - loss: 0.9552 - regression_loss: 0.8532 - classification_loss: 0.1021 282/500 [===============>..............] - ETA: 54s - loss: 0.9530 - regression_loss: 0.8513 - classification_loss: 0.1017 283/500 [===============>..............] - ETA: 54s - loss: 0.9522 - regression_loss: 0.8502 - classification_loss: 0.1020 284/500 [================>.............] - ETA: 54s - loss: 0.9531 - regression_loss: 0.8509 - classification_loss: 0.1022 285/500 [================>.............] - ETA: 53s - loss: 0.9527 - regression_loss: 0.8506 - classification_loss: 0.1021 286/500 [================>.............] - ETA: 53s - loss: 0.9535 - regression_loss: 0.8514 - classification_loss: 0.1021 287/500 [================>.............] - ETA: 53s - loss: 0.9525 - regression_loss: 0.8505 - classification_loss: 0.1020 288/500 [================>.............] - ETA: 53s - loss: 0.9526 - regression_loss: 0.8506 - classification_loss: 0.1020 289/500 [================>.............] - ETA: 52s - loss: 0.9530 - regression_loss: 0.8510 - classification_loss: 0.1020 290/500 [================>.............] - ETA: 52s - loss: 0.9509 - regression_loss: 0.8492 - classification_loss: 0.1018 291/500 [================>.............] - ETA: 52s - loss: 0.9515 - regression_loss: 0.8497 - classification_loss: 0.1018 292/500 [================>.............] 
- ETA: 52s - loss: 0.9519 - regression_loss: 0.8499 - classification_loss: 0.1020 293/500 [================>.............] - ETA: 51s - loss: 0.9538 - regression_loss: 0.8514 - classification_loss: 0.1024 294/500 [================>.............] - ETA: 51s - loss: 0.9551 - regression_loss: 0.8525 - classification_loss: 0.1026 295/500 [================>.............] - ETA: 51s - loss: 0.9557 - regression_loss: 0.8531 - classification_loss: 0.1026 296/500 [================>.............] - ETA: 51s - loss: 0.9561 - regression_loss: 0.8535 - classification_loss: 0.1026 297/500 [================>.............] - ETA: 50s - loss: 0.9571 - regression_loss: 0.8544 - classification_loss: 0.1027 298/500 [================>.............] - ETA: 50s - loss: 0.9557 - regression_loss: 0.8532 - classification_loss: 0.1025 299/500 [================>.............] - ETA: 50s - loss: 0.9551 - regression_loss: 0.8528 - classification_loss: 0.1023 300/500 [=================>............] - ETA: 50s - loss: 0.9562 - regression_loss: 0.8538 - classification_loss: 0.1024 301/500 [=================>............] - ETA: 49s - loss: 0.9565 - regression_loss: 0.8542 - classification_loss: 0.1024 302/500 [=================>............] - ETA: 49s - loss: 0.9564 - regression_loss: 0.8543 - classification_loss: 0.1021 303/500 [=================>............] - ETA: 49s - loss: 0.9561 - regression_loss: 0.8541 - classification_loss: 0.1020 304/500 [=================>............] - ETA: 49s - loss: 0.9552 - regression_loss: 0.8533 - classification_loss: 0.1019 305/500 [=================>............] - ETA: 48s - loss: 0.9554 - regression_loss: 0.8535 - classification_loss: 0.1019 306/500 [=================>............] - ETA: 48s - loss: 0.9564 - regression_loss: 0.8544 - classification_loss: 0.1020 307/500 [=================>............] - ETA: 48s - loss: 0.9554 - regression_loss: 0.8536 - classification_loss: 0.1018 308/500 [=================>............] 
- ETA: 48s - loss: 0.9559 - regression_loss: 0.8540 - classification_loss: 0.1019 309/500 [=================>............] - ETA: 47s - loss: 0.9562 - regression_loss: 0.8543 - classification_loss: 0.1019 310/500 [=================>............] - ETA: 47s - loss: 0.9545 - regression_loss: 0.8529 - classification_loss: 0.1016 311/500 [=================>............] - ETA: 47s - loss: 0.9551 - regression_loss: 0.8536 - classification_loss: 0.1015 312/500 [=================>............] - ETA: 47s - loss: 0.9540 - regression_loss: 0.8527 - classification_loss: 0.1013 313/500 [=================>............] - ETA: 46s - loss: 0.9546 - regression_loss: 0.8532 - classification_loss: 0.1014 314/500 [=================>............] - ETA: 46s - loss: 0.9542 - regression_loss: 0.8529 - classification_loss: 0.1013 315/500 [=================>............] - ETA: 46s - loss: 0.9538 - regression_loss: 0.8526 - classification_loss: 0.1012 316/500 [=================>............] - ETA: 46s - loss: 0.9531 - regression_loss: 0.8519 - classification_loss: 0.1012 317/500 [==================>...........] - ETA: 45s - loss: 0.9533 - regression_loss: 0.8519 - classification_loss: 0.1013 318/500 [==================>...........] - ETA: 45s - loss: 0.9541 - regression_loss: 0.8528 - classification_loss: 0.1013 319/500 [==================>...........] - ETA: 45s - loss: 0.9545 - regression_loss: 0.8532 - classification_loss: 0.1012 320/500 [==================>...........] - ETA: 45s - loss: 0.9544 - regression_loss: 0.8530 - classification_loss: 0.1014 321/500 [==================>...........] - ETA: 44s - loss: 0.9541 - regression_loss: 0.8528 - classification_loss: 0.1013 322/500 [==================>...........] - ETA: 44s - loss: 0.9545 - regression_loss: 0.8532 - classification_loss: 0.1013 323/500 [==================>...........] - ETA: 44s - loss: 0.9552 - regression_loss: 0.8537 - classification_loss: 0.1015 324/500 [==================>...........] 
- ETA: 44s - loss: 0.9561 - regression_loss: 0.8546 - classification_loss: 0.1015 325/500 [==================>...........] - ETA: 43s - loss: 0.9571 - regression_loss: 0.8556 - classification_loss: 0.1015 326/500 [==================>...........] - ETA: 43s - loss: 0.9576 - regression_loss: 0.8561 - classification_loss: 0.1016 327/500 [==================>...........] - ETA: 43s - loss: 0.9583 - regression_loss: 0.8566 - classification_loss: 0.1017 328/500 [==================>...........] - ETA: 43s - loss: 0.9598 - regression_loss: 0.8579 - classification_loss: 0.1020 329/500 [==================>...........] - ETA: 42s - loss: 0.9600 - regression_loss: 0.8580 - classification_loss: 0.1020 330/500 [==================>...........] - ETA: 42s - loss: 0.9599 - regression_loss: 0.8579 - classification_loss: 0.1020 331/500 [==================>...........] - ETA: 42s - loss: 0.9586 - regression_loss: 0.8569 - classification_loss: 0.1017 332/500 [==================>...........] - ETA: 42s - loss: 0.9597 - regression_loss: 0.8578 - classification_loss: 0.1019 333/500 [==================>...........] - ETA: 41s - loss: 0.9609 - regression_loss: 0.8588 - classification_loss: 0.1021 334/500 [===================>..........] - ETA: 41s - loss: 0.9618 - regression_loss: 0.8597 - classification_loss: 0.1021 335/500 [===================>..........] - ETA: 41s - loss: 0.9628 - regression_loss: 0.8605 - classification_loss: 0.1023 336/500 [===================>..........] - ETA: 41s - loss: 0.9639 - regression_loss: 0.8613 - classification_loss: 0.1025 337/500 [===================>..........] - ETA: 40s - loss: 0.9649 - regression_loss: 0.8621 - classification_loss: 0.1027 338/500 [===================>..........] - ETA: 40s - loss: 0.9658 - regression_loss: 0.8628 - classification_loss: 0.1029 339/500 [===================>..........] - ETA: 40s - loss: 0.9655 - regression_loss: 0.8628 - classification_loss: 0.1028 340/500 [===================>..........] 
- ETA: 40s - loss: 0.9654 - regression_loss: 0.8627 - classification_loss: 0.1027 341/500 [===================>..........] - ETA: 39s - loss: 0.9659 - regression_loss: 0.8631 - classification_loss: 0.1028 342/500 [===================>..........] - ETA: 39s - loss: 0.9646 - regression_loss: 0.8620 - classification_loss: 0.1026 343/500 [===================>..........] - ETA: 39s - loss: 0.9651 - regression_loss: 0.8625 - classification_loss: 0.1026 344/500 [===================>..........] - ETA: 39s - loss: 0.9657 - regression_loss: 0.8630 - classification_loss: 0.1027 345/500 [===================>..........] - ETA: 38s - loss: 0.9670 - regression_loss: 0.8641 - classification_loss: 0.1030 346/500 [===================>..........] - ETA: 38s - loss: 0.9661 - regression_loss: 0.8633 - classification_loss: 0.1028 347/500 [===================>..........] - ETA: 38s - loss: 0.9662 - regression_loss: 0.8634 - classification_loss: 0.1029 348/500 [===================>..........] - ETA: 38s - loss: 0.9670 - regression_loss: 0.8641 - classification_loss: 0.1029 349/500 [===================>..........] - ETA: 37s - loss: 0.9655 - regression_loss: 0.8629 - classification_loss: 0.1026 350/500 [====================>.........] - ETA: 37s - loss: 0.9637 - regression_loss: 0.8613 - classification_loss: 0.1024 351/500 [====================>.........] - ETA: 37s - loss: 0.9635 - regression_loss: 0.8611 - classification_loss: 0.1024 352/500 [====================>.........] - ETA: 37s - loss: 0.9647 - regression_loss: 0.8621 - classification_loss: 0.1026 353/500 [====================>.........] - ETA: 36s - loss: 0.9644 - regression_loss: 0.8618 - classification_loss: 0.1026 354/500 [====================>.........] - ETA: 36s - loss: 0.9646 - regression_loss: 0.8620 - classification_loss: 0.1026 355/500 [====================>.........] - ETA: 36s - loss: 0.9645 - regression_loss: 0.8620 - classification_loss: 0.1024 356/500 [====================>.........] 
- ETA: 36s - loss: 0.9654 - regression_loss: 0.8630 - classification_loss: 0.1024 357/500 [====================>.........] - ETA: 35s - loss: 0.9642 - regression_loss: 0.8619 - classification_loss: 0.1022 358/500 [====================>.........] - ETA: 35s - loss: 0.9624 - regression_loss: 0.8604 - classification_loss: 0.1020 359/500 [====================>.........] - ETA: 35s - loss: 0.9622 - regression_loss: 0.8602 - classification_loss: 0.1019 360/500 [====================>.........] - ETA: 35s - loss: 0.9627 - regression_loss: 0.8607 - classification_loss: 0.1020 361/500 [====================>.........] - ETA: 34s - loss: 0.9652 - regression_loss: 0.8629 - classification_loss: 0.1023 362/500 [====================>.........] - ETA: 34s - loss: 0.9659 - regression_loss: 0.8636 - classification_loss: 0.1022 363/500 [====================>.........] - ETA: 34s - loss: 0.9653 - regression_loss: 0.8632 - classification_loss: 0.1021 364/500 [====================>.........] - ETA: 34s - loss: 0.9661 - regression_loss: 0.8638 - classification_loss: 0.1023 365/500 [====================>.........] - ETA: 33s - loss: 0.9666 - regression_loss: 0.8642 - classification_loss: 0.1023 366/500 [====================>.........] - ETA: 33s - loss: 0.9658 - regression_loss: 0.8636 - classification_loss: 0.1022 367/500 [=====================>........] - ETA: 33s - loss: 0.9664 - regression_loss: 0.8642 - classification_loss: 0.1023 368/500 [=====================>........] - ETA: 33s - loss: 0.9654 - regression_loss: 0.8633 - classification_loss: 0.1021 369/500 [=====================>........] - ETA: 32s - loss: 0.9653 - regression_loss: 0.8633 - classification_loss: 0.1020 370/500 [=====================>........] - ETA: 32s - loss: 0.9647 - regression_loss: 0.8628 - classification_loss: 0.1019 371/500 [=====================>........] - ETA: 32s - loss: 0.9644 - regression_loss: 0.8626 - classification_loss: 0.1018 372/500 [=====================>........] 
[... per-step progress output truncated: steps 373-499 of epoch 43, running loss hovering around 0.96 ...]
500/500 [==============================] - 125s 251ms/step - loss: 0.9592 - regression_loss: 0.8589 - classification_loss: 0.1004
1172 instances of class plum with average precision: 0.7887
mAP: 0.7887
Epoch 00043: saving model to ./training/snapshots/resnet50_pascal_43.h5
Epoch 44/150
[... per-step progress output truncated: steps 1-206 of epoch 44, running loss settling around 0.92 ...]
- ETA: 1:13 - loss: 0.9230 - regression_loss: 0.8243 - classification_loss: 0.0987 207/500 [===========>..................] - ETA: 1:13 - loss: 0.9215 - regression_loss: 0.8229 - classification_loss: 0.0986 208/500 [===========>..................] - ETA: 1:13 - loss: 0.9211 - regression_loss: 0.8227 - classification_loss: 0.0985 209/500 [===========>..................] - ETA: 1:12 - loss: 0.9194 - regression_loss: 0.8212 - classification_loss: 0.0981 210/500 [===========>..................] - ETA: 1:12 - loss: 0.9200 - regression_loss: 0.8217 - classification_loss: 0.0983 211/500 [===========>..................] - ETA: 1:12 - loss: 0.9205 - regression_loss: 0.8221 - classification_loss: 0.0984 212/500 [===========>..................] - ETA: 1:12 - loss: 0.9192 - regression_loss: 0.8210 - classification_loss: 0.0982 213/500 [===========>..................] - ETA: 1:11 - loss: 0.9199 - regression_loss: 0.8217 - classification_loss: 0.0982 214/500 [===========>..................] - ETA: 1:11 - loss: 0.9220 - regression_loss: 0.8233 - classification_loss: 0.0986 215/500 [===========>..................] - ETA: 1:11 - loss: 0.9229 - regression_loss: 0.8243 - classification_loss: 0.0986 216/500 [===========>..................] - ETA: 1:11 - loss: 0.9209 - regression_loss: 0.8227 - classification_loss: 0.0982 217/500 [============>.................] - ETA: 1:10 - loss: 0.9188 - regression_loss: 0.8209 - classification_loss: 0.0979 218/500 [============>.................] - ETA: 1:10 - loss: 0.9177 - regression_loss: 0.8200 - classification_loss: 0.0977 219/500 [============>.................] - ETA: 1:10 - loss: 0.9180 - regression_loss: 0.8200 - classification_loss: 0.0980 220/500 [============>.................] - ETA: 1:10 - loss: 0.9200 - regression_loss: 0.8217 - classification_loss: 0.0984 221/500 [============>.................] - ETA: 1:09 - loss: 0.9202 - regression_loss: 0.8219 - classification_loss: 0.0984 222/500 [============>.................] 
- ETA: 1:09 - loss: 0.9206 - regression_loss: 0.8224 - classification_loss: 0.0982 223/500 [============>.................] - ETA: 1:09 - loss: 0.9218 - regression_loss: 0.8235 - classification_loss: 0.0983 224/500 [============>.................] - ETA: 1:09 - loss: 0.9238 - regression_loss: 0.8253 - classification_loss: 0.0985 225/500 [============>.................] - ETA: 1:08 - loss: 0.9235 - regression_loss: 0.8250 - classification_loss: 0.0985 226/500 [============>.................] - ETA: 1:08 - loss: 0.9204 - regression_loss: 0.8223 - classification_loss: 0.0981 227/500 [============>.................] - ETA: 1:08 - loss: 0.9204 - regression_loss: 0.8225 - classification_loss: 0.0979 228/500 [============>.................] - ETA: 1:08 - loss: 0.9214 - regression_loss: 0.8234 - classification_loss: 0.0981 229/500 [============>.................] - ETA: 1:07 - loss: 0.9215 - regression_loss: 0.8234 - classification_loss: 0.0981 230/500 [============>.................] - ETA: 1:07 - loss: 0.9199 - regression_loss: 0.8222 - classification_loss: 0.0978 231/500 [============>.................] - ETA: 1:07 - loss: 0.9203 - regression_loss: 0.8225 - classification_loss: 0.0979 232/500 [============>.................] - ETA: 1:07 - loss: 0.9192 - regression_loss: 0.8216 - classification_loss: 0.0976 233/500 [============>.................] - ETA: 1:06 - loss: 0.9186 - regression_loss: 0.8211 - classification_loss: 0.0975 234/500 [=============>................] - ETA: 1:06 - loss: 0.9177 - regression_loss: 0.8205 - classification_loss: 0.0972 235/500 [=============>................] - ETA: 1:06 - loss: 0.9183 - regression_loss: 0.8210 - classification_loss: 0.0973 236/500 [=============>................] - ETA: 1:06 - loss: 0.9178 - regression_loss: 0.8206 - classification_loss: 0.0972 237/500 [=============>................] - ETA: 1:05 - loss: 0.9161 - regression_loss: 0.8191 - classification_loss: 0.0970 238/500 [=============>................] 
- ETA: 1:05 - loss: 0.9184 - regression_loss: 0.8211 - classification_loss: 0.0973 239/500 [=============>................] - ETA: 1:05 - loss: 0.9155 - regression_loss: 0.8185 - classification_loss: 0.0970 240/500 [=============>................] - ETA: 1:05 - loss: 0.9163 - regression_loss: 0.8193 - classification_loss: 0.0970 241/500 [=============>................] - ETA: 1:04 - loss: 0.9136 - regression_loss: 0.8168 - classification_loss: 0.0967 242/500 [=============>................] - ETA: 1:04 - loss: 0.9123 - regression_loss: 0.8157 - classification_loss: 0.0966 243/500 [=============>................] - ETA: 1:04 - loss: 0.9106 - regression_loss: 0.8143 - classification_loss: 0.0963 244/500 [=============>................] - ETA: 1:04 - loss: 0.9090 - regression_loss: 0.8130 - classification_loss: 0.0960 245/500 [=============>................] - ETA: 1:03 - loss: 0.9107 - regression_loss: 0.8143 - classification_loss: 0.0964 246/500 [=============>................] - ETA: 1:03 - loss: 0.9123 - regression_loss: 0.8156 - classification_loss: 0.0967 247/500 [=============>................] - ETA: 1:03 - loss: 0.9126 - regression_loss: 0.8159 - classification_loss: 0.0967 248/500 [=============>................] - ETA: 1:03 - loss: 0.9144 - regression_loss: 0.8174 - classification_loss: 0.0970 249/500 [=============>................] - ETA: 1:02 - loss: 0.9145 - regression_loss: 0.8175 - classification_loss: 0.0970 250/500 [==============>...............] - ETA: 1:02 - loss: 0.9142 - regression_loss: 0.8174 - classification_loss: 0.0969 251/500 [==============>...............] - ETA: 1:02 - loss: 0.9157 - regression_loss: 0.8187 - classification_loss: 0.0970 252/500 [==============>...............] - ETA: 1:02 - loss: 0.9172 - regression_loss: 0.8195 - classification_loss: 0.0977 253/500 [==============>...............] - ETA: 1:01 - loss: 0.9173 - regression_loss: 0.8196 - classification_loss: 0.0976 254/500 [==============>...............] 
- ETA: 1:01 - loss: 0.9182 - regression_loss: 0.8201 - classification_loss: 0.0981 255/500 [==============>...............] - ETA: 1:01 - loss: 0.9167 - regression_loss: 0.8189 - classification_loss: 0.0978 256/500 [==============>...............] - ETA: 1:01 - loss: 0.9179 - regression_loss: 0.8198 - classification_loss: 0.0980 257/500 [==============>...............] - ETA: 1:00 - loss: 0.9152 - regression_loss: 0.8174 - classification_loss: 0.0978 258/500 [==============>...............] - ETA: 1:00 - loss: 0.9123 - regression_loss: 0.8148 - classification_loss: 0.0974 259/500 [==============>...............] - ETA: 1:00 - loss: 0.9105 - regression_loss: 0.8133 - classification_loss: 0.0972 260/500 [==============>...............] - ETA: 1:00 - loss: 0.9129 - regression_loss: 0.8153 - classification_loss: 0.0976 261/500 [==============>...............] - ETA: 59s - loss: 0.9144 - regression_loss: 0.8165 - classification_loss: 0.0979  262/500 [==============>...............] - ETA: 59s - loss: 0.9166 - regression_loss: 0.8184 - classification_loss: 0.0982 263/500 [==============>...............] - ETA: 59s - loss: 0.9158 - regression_loss: 0.8178 - classification_loss: 0.0980 264/500 [==============>...............] - ETA: 59s - loss: 0.9154 - regression_loss: 0.8174 - classification_loss: 0.0979 265/500 [==============>...............] - ETA: 58s - loss: 0.9180 - regression_loss: 0.8196 - classification_loss: 0.0985 266/500 [==============>...............] - ETA: 58s - loss: 0.9200 - regression_loss: 0.8213 - classification_loss: 0.0987 267/500 [===============>..............] - ETA: 58s - loss: 0.9194 - regression_loss: 0.8209 - classification_loss: 0.0985 268/500 [===============>..............] - ETA: 58s - loss: 0.9178 - regression_loss: 0.8196 - classification_loss: 0.0981 269/500 [===============>..............] - ETA: 57s - loss: 0.9207 - regression_loss: 0.8220 - classification_loss: 0.0986 270/500 [===============>..............] 
- ETA: 57s - loss: 0.9197 - regression_loss: 0.8212 - classification_loss: 0.0985 271/500 [===============>..............] - ETA: 57s - loss: 0.9202 - regression_loss: 0.8216 - classification_loss: 0.0986 272/500 [===============>..............] - ETA: 57s - loss: 0.9203 - regression_loss: 0.8218 - classification_loss: 0.0985 273/500 [===============>..............] - ETA: 56s - loss: 0.9194 - regression_loss: 0.8210 - classification_loss: 0.0984 274/500 [===============>..............] - ETA: 56s - loss: 0.9208 - regression_loss: 0.8223 - classification_loss: 0.0986 275/500 [===============>..............] - ETA: 56s - loss: 0.9212 - regression_loss: 0.8226 - classification_loss: 0.0986 276/500 [===============>..............] - ETA: 56s - loss: 0.9211 - regression_loss: 0.8223 - classification_loss: 0.0988 277/500 [===============>..............] - ETA: 55s - loss: 0.9194 - regression_loss: 0.8207 - classification_loss: 0.0987 278/500 [===============>..............] - ETA: 55s - loss: 0.9201 - regression_loss: 0.8214 - classification_loss: 0.0986 279/500 [===============>..............] - ETA: 55s - loss: 0.9220 - regression_loss: 0.8231 - classification_loss: 0.0989 280/500 [===============>..............] - ETA: 55s - loss: 0.9197 - regression_loss: 0.8210 - classification_loss: 0.0987 281/500 [===============>..............] - ETA: 54s - loss: 0.9191 - regression_loss: 0.8206 - classification_loss: 0.0986 282/500 [===============>..............] - ETA: 54s - loss: 0.9183 - regression_loss: 0.8199 - classification_loss: 0.0984 283/500 [===============>..............] - ETA: 54s - loss: 0.9187 - regression_loss: 0.8203 - classification_loss: 0.0984 284/500 [================>.............] - ETA: 54s - loss: 0.9184 - regression_loss: 0.8200 - classification_loss: 0.0984 285/500 [================>.............] - ETA: 53s - loss: 0.9188 - regression_loss: 0.8203 - classification_loss: 0.0985 286/500 [================>.............] 
- ETA: 53s - loss: 0.9199 - regression_loss: 0.8212 - classification_loss: 0.0987 287/500 [================>.............] - ETA: 53s - loss: 0.9201 - regression_loss: 0.8215 - classification_loss: 0.0986 288/500 [================>.............] - ETA: 53s - loss: 0.9196 - regression_loss: 0.8210 - classification_loss: 0.0986 289/500 [================>.............] - ETA: 52s - loss: 0.9185 - regression_loss: 0.8202 - classification_loss: 0.0983 290/500 [================>.............] - ETA: 52s - loss: 0.9187 - regression_loss: 0.8202 - classification_loss: 0.0984 291/500 [================>.............] - ETA: 52s - loss: 0.9187 - regression_loss: 0.8204 - classification_loss: 0.0983 292/500 [================>.............] - ETA: 52s - loss: 0.9176 - regression_loss: 0.8195 - classification_loss: 0.0981 293/500 [================>.............] - ETA: 51s - loss: 0.9172 - regression_loss: 0.8192 - classification_loss: 0.0980 294/500 [================>.............] - ETA: 51s - loss: 0.9185 - regression_loss: 0.8202 - classification_loss: 0.0983 295/500 [================>.............] - ETA: 51s - loss: 0.9199 - regression_loss: 0.8213 - classification_loss: 0.0986 296/500 [================>.............] - ETA: 51s - loss: 0.9212 - regression_loss: 0.8224 - classification_loss: 0.0988 297/500 [================>.............] - ETA: 50s - loss: 0.9207 - regression_loss: 0.8220 - classification_loss: 0.0987 298/500 [================>.............] - ETA: 50s - loss: 0.9209 - regression_loss: 0.8222 - classification_loss: 0.0987 299/500 [================>.............] - ETA: 50s - loss: 0.9233 - regression_loss: 0.8243 - classification_loss: 0.0990 300/500 [=================>............] - ETA: 50s - loss: 0.9209 - regression_loss: 0.8222 - classification_loss: 0.0987 301/500 [=================>............] - ETA: 49s - loss: 0.9208 - regression_loss: 0.8220 - classification_loss: 0.0988 302/500 [=================>............] 
- ETA: 49s - loss: 0.9230 - regression_loss: 0.8236 - classification_loss: 0.0994 303/500 [=================>............] - ETA: 49s - loss: 0.9239 - regression_loss: 0.8243 - classification_loss: 0.0996 304/500 [=================>............] - ETA: 49s - loss: 0.9248 - regression_loss: 0.8251 - classification_loss: 0.0997 305/500 [=================>............] - ETA: 48s - loss: 0.9249 - regression_loss: 0.8251 - classification_loss: 0.0998 306/500 [=================>............] - ETA: 48s - loss: 0.9243 - regression_loss: 0.8247 - classification_loss: 0.0997 307/500 [=================>............] - ETA: 48s - loss: 0.9246 - regression_loss: 0.8252 - classification_loss: 0.0995 308/500 [=================>............] - ETA: 48s - loss: 0.9235 - regression_loss: 0.8242 - classification_loss: 0.0993 309/500 [=================>............] - ETA: 47s - loss: 0.9230 - regression_loss: 0.8240 - classification_loss: 0.0990 310/500 [=================>............] - ETA: 47s - loss: 0.9228 - regression_loss: 0.8237 - classification_loss: 0.0990 311/500 [=================>............] - ETA: 47s - loss: 0.9227 - regression_loss: 0.8237 - classification_loss: 0.0990 312/500 [=================>............] - ETA: 47s - loss: 0.9214 - regression_loss: 0.8226 - classification_loss: 0.0988 313/500 [=================>............] - ETA: 46s - loss: 0.9220 - regression_loss: 0.8232 - classification_loss: 0.0988 314/500 [=================>............] - ETA: 46s - loss: 0.9237 - regression_loss: 0.8247 - classification_loss: 0.0990 315/500 [=================>............] - ETA: 46s - loss: 0.9227 - regression_loss: 0.8239 - classification_loss: 0.0988 316/500 [=================>............] - ETA: 46s - loss: 0.9230 - regression_loss: 0.8243 - classification_loss: 0.0987 317/500 [==================>...........] - ETA: 45s - loss: 0.9206 - regression_loss: 0.8221 - classification_loss: 0.0985 318/500 [==================>...........] 
- ETA: 45s - loss: 0.9209 - regression_loss: 0.8225 - classification_loss: 0.0984 319/500 [==================>...........] - ETA: 45s - loss: 0.9208 - regression_loss: 0.8223 - classification_loss: 0.0984 320/500 [==================>...........] - ETA: 45s - loss: 0.9211 - regression_loss: 0.8228 - classification_loss: 0.0983 321/500 [==================>...........] - ETA: 44s - loss: 0.9202 - regression_loss: 0.8221 - classification_loss: 0.0982 322/500 [==================>...........] - ETA: 44s - loss: 0.9202 - regression_loss: 0.8221 - classification_loss: 0.0981 323/500 [==================>...........] - ETA: 44s - loss: 0.9208 - regression_loss: 0.8226 - classification_loss: 0.0982 324/500 [==================>...........] - ETA: 44s - loss: 0.9208 - regression_loss: 0.8228 - classification_loss: 0.0980 325/500 [==================>...........] - ETA: 43s - loss: 0.9197 - regression_loss: 0.8219 - classification_loss: 0.0978 326/500 [==================>...........] - ETA: 43s - loss: 0.9197 - regression_loss: 0.8219 - classification_loss: 0.0978 327/500 [==================>...........] - ETA: 43s - loss: 0.9206 - regression_loss: 0.8228 - classification_loss: 0.0978 328/500 [==================>...........] - ETA: 43s - loss: 0.9193 - regression_loss: 0.8217 - classification_loss: 0.0976 329/500 [==================>...........] - ETA: 42s - loss: 0.9192 - regression_loss: 0.8216 - classification_loss: 0.0976 330/500 [==================>...........] - ETA: 42s - loss: 0.9192 - regression_loss: 0.8216 - classification_loss: 0.0976 331/500 [==================>...........] - ETA: 42s - loss: 0.9197 - regression_loss: 0.8221 - classification_loss: 0.0976 332/500 [==================>...........] - ETA: 42s - loss: 0.9194 - regression_loss: 0.8218 - classification_loss: 0.0975 333/500 [==================>...........] - ETA: 41s - loss: 0.9187 - regression_loss: 0.8213 - classification_loss: 0.0975 334/500 [===================>..........] 
- ETA: 41s - loss: 0.9185 - regression_loss: 0.8211 - classification_loss: 0.0973 335/500 [===================>..........] - ETA: 41s - loss: 0.9181 - regression_loss: 0.8208 - classification_loss: 0.0972 336/500 [===================>..........] - ETA: 41s - loss: 0.9188 - regression_loss: 0.8215 - classification_loss: 0.0973 337/500 [===================>..........] - ETA: 40s - loss: 0.9191 - regression_loss: 0.8217 - classification_loss: 0.0974 338/500 [===================>..........] - ETA: 40s - loss: 0.9187 - regression_loss: 0.8214 - classification_loss: 0.0974 339/500 [===================>..........] - ETA: 40s - loss: 0.9197 - regression_loss: 0.8222 - classification_loss: 0.0975 340/500 [===================>..........] - ETA: 40s - loss: 0.9223 - regression_loss: 0.8241 - classification_loss: 0.0982 341/500 [===================>..........] - ETA: 39s - loss: 0.9226 - regression_loss: 0.8244 - classification_loss: 0.0982 342/500 [===================>..........] - ETA: 39s - loss: 0.9227 - regression_loss: 0.8245 - classification_loss: 0.0982 343/500 [===================>..........] - ETA: 39s - loss: 0.9234 - regression_loss: 0.8251 - classification_loss: 0.0983 344/500 [===================>..........] - ETA: 39s - loss: 0.9245 - regression_loss: 0.8260 - classification_loss: 0.0984 345/500 [===================>..........] - ETA: 38s - loss: 0.9244 - regression_loss: 0.8261 - classification_loss: 0.0983 346/500 [===================>..........] - ETA: 38s - loss: 0.9240 - regression_loss: 0.8257 - classification_loss: 0.0983 347/500 [===================>..........] - ETA: 38s - loss: 0.9254 - regression_loss: 0.8269 - classification_loss: 0.0985 348/500 [===================>..........] - ETA: 38s - loss: 0.9265 - regression_loss: 0.8280 - classification_loss: 0.0985 349/500 [===================>..........] - ETA: 37s - loss: 0.9269 - regression_loss: 0.8285 - classification_loss: 0.0985 350/500 [====================>.........] 
- ETA: 37s - loss: 0.9260 - regression_loss: 0.8278 - classification_loss: 0.0983 351/500 [====================>.........] - ETA: 37s - loss: 0.9249 - regression_loss: 0.8268 - classification_loss: 0.0981 352/500 [====================>.........] - ETA: 37s - loss: 0.9241 - regression_loss: 0.8261 - classification_loss: 0.0980 353/500 [====================>.........] - ETA: 36s - loss: 0.9255 - regression_loss: 0.8272 - classification_loss: 0.0983 354/500 [====================>.........] - ETA: 36s - loss: 0.9251 - regression_loss: 0.8269 - classification_loss: 0.0983 355/500 [====================>.........] - ETA: 36s - loss: 0.9241 - regression_loss: 0.8260 - classification_loss: 0.0981 356/500 [====================>.........] - ETA: 36s - loss: 0.9234 - regression_loss: 0.8254 - classification_loss: 0.0980 357/500 [====================>.........] - ETA: 35s - loss: 0.9239 - regression_loss: 0.8257 - classification_loss: 0.0982 358/500 [====================>.........] - ETA: 35s - loss: 0.9242 - regression_loss: 0.8260 - classification_loss: 0.0982 359/500 [====================>.........] - ETA: 35s - loss: 0.9251 - regression_loss: 0.8268 - classification_loss: 0.0984 360/500 [====================>.........] - ETA: 35s - loss: 0.9258 - regression_loss: 0.8274 - classification_loss: 0.0984 361/500 [====================>.........] - ETA: 34s - loss: 0.9244 - regression_loss: 0.8262 - classification_loss: 0.0982 362/500 [====================>.........] - ETA: 34s - loss: 0.9258 - regression_loss: 0.8272 - classification_loss: 0.0985 363/500 [====================>.........] - ETA: 34s - loss: 0.9264 - regression_loss: 0.8280 - classification_loss: 0.0984 364/500 [====================>.........] - ETA: 34s - loss: 0.9264 - regression_loss: 0.8282 - classification_loss: 0.0983 365/500 [====================>.........] - ETA: 33s - loss: 0.9284 - regression_loss: 0.8297 - classification_loss: 0.0987 366/500 [====================>.........] 
- ETA: 33s - loss: 0.9278 - regression_loss: 0.8293 - classification_loss: 0.0985 367/500 [=====================>........] - ETA: 33s - loss: 0.9266 - regression_loss: 0.8281 - classification_loss: 0.0984 368/500 [=====================>........] - ETA: 33s - loss: 0.9247 - regression_loss: 0.8265 - classification_loss: 0.0982 369/500 [=====================>........] - ETA: 32s - loss: 0.9247 - regression_loss: 0.8265 - classification_loss: 0.0982 370/500 [=====================>........] - ETA: 32s - loss: 0.9242 - regression_loss: 0.8261 - classification_loss: 0.0981 371/500 [=====================>........] - ETA: 32s - loss: 0.9253 - regression_loss: 0.8270 - classification_loss: 0.0983 372/500 [=====================>........] - ETA: 32s - loss: 0.9262 - regression_loss: 0.8279 - classification_loss: 0.0983 373/500 [=====================>........] - ETA: 31s - loss: 0.9259 - regression_loss: 0.8277 - classification_loss: 0.0982 374/500 [=====================>........] - ETA: 31s - loss: 0.9266 - regression_loss: 0.8283 - classification_loss: 0.0984 375/500 [=====================>........] - ETA: 31s - loss: 0.9271 - regression_loss: 0.8287 - classification_loss: 0.0984 376/500 [=====================>........] - ETA: 31s - loss: 0.9281 - regression_loss: 0.8294 - classification_loss: 0.0987 377/500 [=====================>........] - ETA: 30s - loss: 0.9294 - regression_loss: 0.8305 - classification_loss: 0.0989 378/500 [=====================>........] - ETA: 30s - loss: 0.9283 - regression_loss: 0.8296 - classification_loss: 0.0987 379/500 [=====================>........] - ETA: 30s - loss: 0.9284 - regression_loss: 0.8296 - classification_loss: 0.0988 380/500 [=====================>........] - ETA: 30s - loss: 0.9284 - regression_loss: 0.8297 - classification_loss: 0.0988 381/500 [=====================>........] - ETA: 29s - loss: 0.9291 - regression_loss: 0.8303 - classification_loss: 0.0988 382/500 [=====================>........] 
- ETA: 29s - loss: 0.9303 - regression_loss: 0.8313 - classification_loss: 0.0990 383/500 [=====================>........] - ETA: 29s - loss: 0.9303 - regression_loss: 0.8314 - classification_loss: 0.0989 384/500 [======================>.......] - ETA: 29s - loss: 0.9313 - regression_loss: 0.8323 - classification_loss: 0.0991 385/500 [======================>.......] - ETA: 28s - loss: 0.9302 - regression_loss: 0.8314 - classification_loss: 0.0989 386/500 [======================>.......] - ETA: 28s - loss: 0.9309 - regression_loss: 0.8319 - classification_loss: 0.0990 387/500 [======================>.......] - ETA: 28s - loss: 0.9309 - regression_loss: 0.8319 - classification_loss: 0.0989 388/500 [======================>.......] - ETA: 28s - loss: 0.9305 - regression_loss: 0.8316 - classification_loss: 0.0989 389/500 [======================>.......] - ETA: 27s - loss: 0.9288 - regression_loss: 0.8301 - classification_loss: 0.0987 390/500 [======================>.......] - ETA: 27s - loss: 0.9292 - regression_loss: 0.8303 - classification_loss: 0.0989 391/500 [======================>.......] - ETA: 27s - loss: 0.9300 - regression_loss: 0.8308 - classification_loss: 0.0991 392/500 [======================>.......] - ETA: 27s - loss: 0.9315 - regression_loss: 0.8321 - classification_loss: 0.0994 393/500 [======================>.......] - ETA: 26s - loss: 0.9319 - regression_loss: 0.8323 - classification_loss: 0.0996 394/500 [======================>.......] - ETA: 26s - loss: 0.9325 - regression_loss: 0.8329 - classification_loss: 0.0995 395/500 [======================>.......] - ETA: 26s - loss: 0.9331 - regression_loss: 0.8336 - classification_loss: 0.0996 396/500 [======================>.......] - ETA: 26s - loss: 0.9334 - regression_loss: 0.8338 - classification_loss: 0.0997 397/500 [======================>.......] - ETA: 25s - loss: 0.9335 - regression_loss: 0.8338 - classification_loss: 0.0996 398/500 [======================>.......] 
- ETA: 25s - loss: 0.9347 - regression_loss: 0.8349 - classification_loss: 0.0998 399/500 [======================>.......] - ETA: 25s - loss: 0.9350 - regression_loss: 0.8351 - classification_loss: 0.1000 400/500 [=======================>......] - ETA: 25s - loss: 0.9348 - regression_loss: 0.8349 - classification_loss: 0.0999 401/500 [=======================>......] - ETA: 24s - loss: 0.9354 - regression_loss: 0.8355 - classification_loss: 0.0999 402/500 [=======================>......] - ETA: 24s - loss: 0.9357 - regression_loss: 0.8357 - classification_loss: 0.0999 403/500 [=======================>......] - ETA: 24s - loss: 0.9353 - regression_loss: 0.8355 - classification_loss: 0.0998 404/500 [=======================>......] - ETA: 24s - loss: 0.9345 - regression_loss: 0.8349 - classification_loss: 0.0997 405/500 [=======================>......] - ETA: 23s - loss: 0.9355 - regression_loss: 0.8355 - classification_loss: 0.1000 406/500 [=======================>......] - ETA: 23s - loss: 0.9361 - regression_loss: 0.8361 - classification_loss: 0.1001 407/500 [=======================>......] - ETA: 23s - loss: 0.9355 - regression_loss: 0.8355 - classification_loss: 0.0999 408/500 [=======================>......] - ETA: 23s - loss: 0.9354 - regression_loss: 0.8354 - classification_loss: 0.1000 409/500 [=======================>......] - ETA: 22s - loss: 0.9354 - regression_loss: 0.8353 - classification_loss: 0.1001 410/500 [=======================>......] - ETA: 22s - loss: 0.9367 - regression_loss: 0.8364 - classification_loss: 0.1002 411/500 [=======================>......] - ETA: 22s - loss: 0.9368 - regression_loss: 0.8365 - classification_loss: 0.1003 412/500 [=======================>......] - ETA: 22s - loss: 0.9374 - regression_loss: 0.8371 - classification_loss: 0.1002 413/500 [=======================>......] - ETA: 21s - loss: 0.9368 - regression_loss: 0.8367 - classification_loss: 0.1001 414/500 [=======================>......] 
500/500 [==============================] - 125s 251ms/step - loss: 0.9362 - regression_loss: 0.8374 - classification_loss: 0.0988
1172 instances of class plum with average precision: 0.7737
mAP: 0.7737
Epoch 00044: saving model to ./training/snapshots/resnet50_pascal_44.h5
Epoch 45/150
- ETA: 1:02 - loss: 0.9469 - regression_loss: 0.8466 - classification_loss: 0.1003 250/500 [==============>...............] - ETA: 1:02 - loss: 0.9452 - regression_loss: 0.8452 - classification_loss: 0.1001 251/500 [==============>...............] - ETA: 1:02 - loss: 0.9473 - regression_loss: 0.8469 - classification_loss: 0.1004 252/500 [==============>...............] - ETA: 1:02 - loss: 0.9472 - regression_loss: 0.8469 - classification_loss: 0.1003 253/500 [==============>...............] - ETA: 1:01 - loss: 0.9491 - regression_loss: 0.8486 - classification_loss: 0.1005 254/500 [==============>...............] - ETA: 1:01 - loss: 0.9471 - regression_loss: 0.8467 - classification_loss: 0.1003 255/500 [==============>...............] - ETA: 1:01 - loss: 0.9464 - regression_loss: 0.8462 - classification_loss: 0.1002 256/500 [==============>...............] - ETA: 1:01 - loss: 0.9453 - regression_loss: 0.8453 - classification_loss: 0.1000 257/500 [==============>...............] - ETA: 1:00 - loss: 0.9442 - regression_loss: 0.8443 - classification_loss: 0.0999 258/500 [==============>...............] - ETA: 1:00 - loss: 0.9441 - regression_loss: 0.8442 - classification_loss: 0.0999 259/500 [==============>...............] - ETA: 1:00 - loss: 0.9440 - regression_loss: 0.8440 - classification_loss: 0.1000 260/500 [==============>...............] - ETA: 1:00 - loss: 0.9446 - regression_loss: 0.8445 - classification_loss: 0.1001 261/500 [==============>...............] - ETA: 59s - loss: 0.9462 - regression_loss: 0.8460 - classification_loss: 0.1001  262/500 [==============>...............] - ETA: 59s - loss: 0.9468 - regression_loss: 0.8464 - classification_loss: 0.1003 263/500 [==============>...............] - ETA: 59s - loss: 0.9484 - regression_loss: 0.8478 - classification_loss: 0.1005 264/500 [==============>...............] - ETA: 59s - loss: 0.9465 - regression_loss: 0.8462 - classification_loss: 0.1003 265/500 [==============>...............] 
- ETA: 58s - loss: 0.9464 - regression_loss: 0.8461 - classification_loss: 0.1002 266/500 [==============>...............] - ETA: 58s - loss: 0.9469 - regression_loss: 0.8464 - classification_loss: 0.1005 267/500 [===============>..............] - ETA: 58s - loss: 0.9464 - regression_loss: 0.8460 - classification_loss: 0.1004 268/500 [===============>..............] - ETA: 58s - loss: 0.9439 - regression_loss: 0.8438 - classification_loss: 0.1001 269/500 [===============>..............] - ETA: 57s - loss: 0.9448 - regression_loss: 0.8447 - classification_loss: 0.1001 270/500 [===============>..............] - ETA: 57s - loss: 0.9427 - regression_loss: 0.8429 - classification_loss: 0.0998 271/500 [===============>..............] - ETA: 57s - loss: 0.9418 - regression_loss: 0.8421 - classification_loss: 0.0996 272/500 [===============>..............] - ETA: 57s - loss: 0.9399 - regression_loss: 0.8405 - classification_loss: 0.0993 273/500 [===============>..............] - ETA: 56s - loss: 0.9390 - regression_loss: 0.8399 - classification_loss: 0.0992 274/500 [===============>..............] - ETA: 56s - loss: 0.9370 - regression_loss: 0.8382 - classification_loss: 0.0989 275/500 [===============>..............] - ETA: 56s - loss: 0.9382 - regression_loss: 0.8392 - classification_loss: 0.0990 276/500 [===============>..............] - ETA: 56s - loss: 0.9392 - regression_loss: 0.8401 - classification_loss: 0.0992 277/500 [===============>..............] - ETA: 55s - loss: 0.9414 - regression_loss: 0.8419 - classification_loss: 0.0995 278/500 [===============>..............] - ETA: 55s - loss: 0.9430 - regression_loss: 0.8431 - classification_loss: 0.0999 279/500 [===============>..............] - ETA: 55s - loss: 0.9439 - regression_loss: 0.8439 - classification_loss: 0.1000 280/500 [===============>..............] - ETA: 55s - loss: 0.9449 - regression_loss: 0.8448 - classification_loss: 0.1001 281/500 [===============>..............] 
- ETA: 54s - loss: 0.9447 - regression_loss: 0.8446 - classification_loss: 0.1000 282/500 [===============>..............] - ETA: 54s - loss: 0.9466 - regression_loss: 0.8463 - classification_loss: 0.1003 283/500 [===============>..............] - ETA: 54s - loss: 0.9464 - regression_loss: 0.8461 - classification_loss: 0.1004 284/500 [================>.............] - ETA: 53s - loss: 0.9455 - regression_loss: 0.8452 - classification_loss: 0.1003 285/500 [================>.............] - ETA: 53s - loss: 0.9450 - regression_loss: 0.8447 - classification_loss: 0.1003 286/500 [================>.............] - ETA: 53s - loss: 0.9460 - regression_loss: 0.8455 - classification_loss: 0.1005 287/500 [================>.............] - ETA: 53s - loss: 0.9447 - regression_loss: 0.8444 - classification_loss: 0.1003 288/500 [================>.............] - ETA: 52s - loss: 0.9448 - regression_loss: 0.8444 - classification_loss: 0.1003 289/500 [================>.............] - ETA: 52s - loss: 0.9424 - regression_loss: 0.8423 - classification_loss: 0.1000 290/500 [================>.............] - ETA: 52s - loss: 0.9435 - regression_loss: 0.8433 - classification_loss: 0.1002 291/500 [================>.............] - ETA: 52s - loss: 0.9437 - regression_loss: 0.8435 - classification_loss: 0.1002 292/500 [================>.............] - ETA: 51s - loss: 0.9454 - regression_loss: 0.8449 - classification_loss: 0.1005 293/500 [================>.............] - ETA: 51s - loss: 0.9456 - regression_loss: 0.8450 - classification_loss: 0.1006 294/500 [================>.............] - ETA: 51s - loss: 0.9452 - regression_loss: 0.8447 - classification_loss: 0.1004 295/500 [================>.............] - ETA: 50s - loss: 0.9440 - regression_loss: 0.8438 - classification_loss: 0.1002 296/500 [================>.............] - ETA: 50s - loss: 0.9448 - regression_loss: 0.8445 - classification_loss: 0.1004 297/500 [================>.............] 
- ETA: 50s - loss: 0.9457 - regression_loss: 0.8452 - classification_loss: 0.1005 298/500 [================>.............] - ETA: 50s - loss: 0.9452 - regression_loss: 0.8447 - classification_loss: 0.1005 299/500 [================>.............] - ETA: 49s - loss: 0.9460 - regression_loss: 0.8455 - classification_loss: 0.1005 300/500 [=================>............] - ETA: 49s - loss: 0.9455 - regression_loss: 0.8450 - classification_loss: 0.1005 301/500 [=================>............] - ETA: 49s - loss: 0.9451 - regression_loss: 0.8447 - classification_loss: 0.1004 302/500 [=================>............] - ETA: 49s - loss: 0.9453 - regression_loss: 0.8448 - classification_loss: 0.1004 303/500 [=================>............] - ETA: 48s - loss: 0.9444 - regression_loss: 0.8441 - classification_loss: 0.1003 304/500 [=================>............] - ETA: 48s - loss: 0.9445 - regression_loss: 0.8443 - classification_loss: 0.1002 305/500 [=================>............] - ETA: 48s - loss: 0.9441 - regression_loss: 0.8441 - classification_loss: 0.1000 306/500 [=================>............] - ETA: 48s - loss: 0.9437 - regression_loss: 0.8439 - classification_loss: 0.0998 307/500 [=================>............] - ETA: 47s - loss: 0.9449 - regression_loss: 0.8449 - classification_loss: 0.1000 308/500 [=================>............] - ETA: 47s - loss: 0.9445 - regression_loss: 0.8447 - classification_loss: 0.0998 309/500 [=================>............] - ETA: 47s - loss: 0.9439 - regression_loss: 0.8442 - classification_loss: 0.0997 310/500 [=================>............] - ETA: 47s - loss: 0.9446 - regression_loss: 0.8451 - classification_loss: 0.0995 311/500 [=================>............] - ETA: 46s - loss: 0.9428 - regression_loss: 0.8436 - classification_loss: 0.0993 312/500 [=================>............] - ETA: 46s - loss: 0.9447 - regression_loss: 0.8452 - classification_loss: 0.0995 313/500 [=================>............] 
- ETA: 46s - loss: 0.9429 - regression_loss: 0.8437 - classification_loss: 0.0992 314/500 [=================>............] - ETA: 46s - loss: 0.9446 - regression_loss: 0.8451 - classification_loss: 0.0994 315/500 [=================>............] - ETA: 45s - loss: 0.9459 - regression_loss: 0.8463 - classification_loss: 0.0996 316/500 [=================>............] - ETA: 45s - loss: 0.9480 - regression_loss: 0.8481 - classification_loss: 0.0999 317/500 [==================>...........] - ETA: 45s - loss: 0.9475 - regression_loss: 0.8478 - classification_loss: 0.0997 318/500 [==================>...........] - ETA: 45s - loss: 0.9469 - regression_loss: 0.8472 - classification_loss: 0.0997 319/500 [==================>...........] - ETA: 44s - loss: 0.9454 - regression_loss: 0.8459 - classification_loss: 0.0995 320/500 [==================>...........] - ETA: 44s - loss: 0.9452 - regression_loss: 0.8458 - classification_loss: 0.0994 321/500 [==================>...........] - ETA: 44s - loss: 0.9450 - regression_loss: 0.8457 - classification_loss: 0.0993 322/500 [==================>...........] - ETA: 44s - loss: 0.9452 - regression_loss: 0.8458 - classification_loss: 0.0994 323/500 [==================>...........] - ETA: 43s - loss: 0.9457 - regression_loss: 0.8463 - classification_loss: 0.0994 324/500 [==================>...........] - ETA: 43s - loss: 0.9478 - regression_loss: 0.8480 - classification_loss: 0.0999 325/500 [==================>...........] - ETA: 43s - loss: 0.9488 - regression_loss: 0.8489 - classification_loss: 0.0999 326/500 [==================>...........] - ETA: 43s - loss: 0.9476 - regression_loss: 0.8478 - classification_loss: 0.0997 327/500 [==================>...........] - ETA: 42s - loss: 0.9471 - regression_loss: 0.8474 - classification_loss: 0.0997 328/500 [==================>...........] - ETA: 42s - loss: 0.9486 - regression_loss: 0.8487 - classification_loss: 0.0999 329/500 [==================>...........] 
- ETA: 42s - loss: 0.9467 - regression_loss: 0.8471 - classification_loss: 0.0996 330/500 [==================>...........] - ETA: 42s - loss: 0.9471 - regression_loss: 0.8474 - classification_loss: 0.0997 331/500 [==================>...........] - ETA: 41s - loss: 0.9469 - regression_loss: 0.8474 - classification_loss: 0.0995 332/500 [==================>...........] - ETA: 41s - loss: 0.9469 - regression_loss: 0.8474 - classification_loss: 0.0995 333/500 [==================>...........] - ETA: 41s - loss: 0.9453 - regression_loss: 0.8459 - classification_loss: 0.0994 334/500 [===================>..........] - ETA: 41s - loss: 0.9453 - regression_loss: 0.8459 - classification_loss: 0.0994 335/500 [===================>..........] - ETA: 40s - loss: 0.9453 - regression_loss: 0.8459 - classification_loss: 0.0994 336/500 [===================>..........] - ETA: 40s - loss: 0.9434 - regression_loss: 0.8442 - classification_loss: 0.0991 337/500 [===================>..........] - ETA: 40s - loss: 0.9416 - regression_loss: 0.8426 - classification_loss: 0.0989 338/500 [===================>..........] - ETA: 40s - loss: 0.9415 - regression_loss: 0.8427 - classification_loss: 0.0988 339/500 [===================>..........] - ETA: 39s - loss: 0.9419 - regression_loss: 0.8430 - classification_loss: 0.0988 340/500 [===================>..........] - ETA: 39s - loss: 0.9411 - regression_loss: 0.8424 - classification_loss: 0.0987 341/500 [===================>..........] - ETA: 39s - loss: 0.9420 - regression_loss: 0.8431 - classification_loss: 0.0989 342/500 [===================>..........] - ETA: 39s - loss: 0.9429 - regression_loss: 0.8438 - classification_loss: 0.0991 343/500 [===================>..........] - ETA: 38s - loss: 0.9439 - regression_loss: 0.8446 - classification_loss: 0.0993 344/500 [===================>..........] - ETA: 38s - loss: 0.9456 - regression_loss: 0.8463 - classification_loss: 0.0993 345/500 [===================>..........] 
- ETA: 38s - loss: 0.9455 - regression_loss: 0.8463 - classification_loss: 0.0992 346/500 [===================>..........] - ETA: 38s - loss: 0.9439 - regression_loss: 0.8449 - classification_loss: 0.0990 347/500 [===================>..........] - ETA: 37s - loss: 0.9447 - regression_loss: 0.8457 - classification_loss: 0.0990 348/500 [===================>..........] - ETA: 37s - loss: 0.9442 - regression_loss: 0.8452 - classification_loss: 0.0990 349/500 [===================>..........] - ETA: 37s - loss: 0.9442 - regression_loss: 0.8453 - classification_loss: 0.0990 350/500 [====================>.........] - ETA: 37s - loss: 0.9442 - regression_loss: 0.8452 - classification_loss: 0.0991 351/500 [====================>.........] - ETA: 36s - loss: 0.9461 - regression_loss: 0.8467 - classification_loss: 0.0994 352/500 [====================>.........] - ETA: 36s - loss: 0.9478 - regression_loss: 0.8482 - classification_loss: 0.0996 353/500 [====================>.........] - ETA: 36s - loss: 0.9486 - regression_loss: 0.8488 - classification_loss: 0.0998 354/500 [====================>.........] - ETA: 36s - loss: 0.9481 - regression_loss: 0.8484 - classification_loss: 0.0997 355/500 [====================>.........] - ETA: 35s - loss: 0.9502 - regression_loss: 0.8504 - classification_loss: 0.0998 356/500 [====================>.........] - ETA: 35s - loss: 0.9527 - regression_loss: 0.8524 - classification_loss: 0.1004 357/500 [====================>.........] - ETA: 35s - loss: 0.9513 - regression_loss: 0.8511 - classification_loss: 0.1002 358/500 [====================>.........] - ETA: 35s - loss: 0.9508 - regression_loss: 0.8507 - classification_loss: 0.1001 359/500 [====================>.........] - ETA: 34s - loss: 0.9514 - regression_loss: 0.8508 - classification_loss: 0.1006 360/500 [====================>.........] - ETA: 34s - loss: 0.9522 - regression_loss: 0.8515 - classification_loss: 0.1008 361/500 [====================>.........] 
- ETA: 34s - loss: 0.9502 - regression_loss: 0.8496 - classification_loss: 0.1006 362/500 [====================>.........] - ETA: 34s - loss: 0.9510 - regression_loss: 0.8504 - classification_loss: 0.1005 363/500 [====================>.........] - ETA: 33s - loss: 0.9507 - regression_loss: 0.8500 - classification_loss: 0.1007 364/500 [====================>.........] - ETA: 33s - loss: 0.9500 - regression_loss: 0.8494 - classification_loss: 0.1006 365/500 [====================>.........] - ETA: 33s - loss: 0.9511 - regression_loss: 0.8504 - classification_loss: 0.1007 366/500 [====================>.........] - ETA: 33s - loss: 0.9505 - regression_loss: 0.8498 - classification_loss: 0.1006 367/500 [=====================>........] - ETA: 32s - loss: 0.9494 - regression_loss: 0.8490 - classification_loss: 0.1004 368/500 [=====================>........] - ETA: 32s - loss: 0.9477 - regression_loss: 0.8475 - classification_loss: 0.1002 369/500 [=====================>........] - ETA: 32s - loss: 0.9484 - regression_loss: 0.8480 - classification_loss: 0.1003 370/500 [=====================>........] - ETA: 32s - loss: 0.9485 - regression_loss: 0.8483 - classification_loss: 0.1001 371/500 [=====================>........] - ETA: 31s - loss: 0.9487 - regression_loss: 0.8487 - classification_loss: 0.1000 372/500 [=====================>........] - ETA: 31s - loss: 0.9475 - regression_loss: 0.8477 - classification_loss: 0.0998 373/500 [=====================>........] - ETA: 31s - loss: 0.9470 - regression_loss: 0.8472 - classification_loss: 0.0998 374/500 [=====================>........] - ETA: 31s - loss: 0.9475 - regression_loss: 0.8476 - classification_loss: 0.0999 375/500 [=====================>........] - ETA: 30s - loss: 0.9488 - regression_loss: 0.8486 - classification_loss: 0.1001 376/500 [=====================>........] - ETA: 30s - loss: 0.9482 - regression_loss: 0.8482 - classification_loss: 0.1000 377/500 [=====================>........] 
- ETA: 30s - loss: 0.9487 - regression_loss: 0.8486 - classification_loss: 0.1001 378/500 [=====================>........] - ETA: 30s - loss: 0.9470 - regression_loss: 0.8471 - classification_loss: 0.0999 379/500 [=====================>........] - ETA: 30s - loss: 0.9469 - regression_loss: 0.8470 - classification_loss: 0.0999 380/500 [=====================>........] - ETA: 29s - loss: 0.9474 - regression_loss: 0.8474 - classification_loss: 0.1000 381/500 [=====================>........] - ETA: 29s - loss: 0.9477 - regression_loss: 0.8476 - classification_loss: 0.1000 382/500 [=====================>........] - ETA: 29s - loss: 0.9483 - regression_loss: 0.8481 - classification_loss: 0.1002 383/500 [=====================>........] - ETA: 29s - loss: 0.9470 - regression_loss: 0.8470 - classification_loss: 0.1000 384/500 [======================>.......] - ETA: 28s - loss: 0.9481 - regression_loss: 0.8479 - classification_loss: 0.1002 385/500 [======================>.......] - ETA: 28s - loss: 0.9479 - regression_loss: 0.8476 - classification_loss: 0.1002 386/500 [======================>.......] - ETA: 28s - loss: 0.9478 - regression_loss: 0.8477 - classification_loss: 0.1001 387/500 [======================>.......] - ETA: 28s - loss: 0.9475 - regression_loss: 0.8474 - classification_loss: 0.1001 388/500 [======================>.......] - ETA: 27s - loss: 0.9467 - regression_loss: 0.8467 - classification_loss: 0.1000 389/500 [======================>.......] - ETA: 27s - loss: 0.9456 - regression_loss: 0.8458 - classification_loss: 0.0998 390/500 [======================>.......] - ETA: 27s - loss: 0.9458 - regression_loss: 0.8460 - classification_loss: 0.0997 391/500 [======================>.......] - ETA: 27s - loss: 0.9449 - regression_loss: 0.8453 - classification_loss: 0.0996 392/500 [======================>.......] - ETA: 26s - loss: 0.9453 - regression_loss: 0.8456 - classification_loss: 0.0996 393/500 [======================>.......] 
- ETA: 26s - loss: 0.9454 - regression_loss: 0.8458 - classification_loss: 0.0996 394/500 [======================>.......] - ETA: 26s - loss: 0.9457 - regression_loss: 0.8462 - classification_loss: 0.0995 395/500 [======================>.......] - ETA: 26s - loss: 0.9456 - regression_loss: 0.8462 - classification_loss: 0.0995 396/500 [======================>.......] - ETA: 25s - loss: 0.9464 - regression_loss: 0.8468 - classification_loss: 0.0996 397/500 [======================>.......] - ETA: 25s - loss: 0.9470 - regression_loss: 0.8473 - classification_loss: 0.0997 398/500 [======================>.......] - ETA: 25s - loss: 0.9478 - regression_loss: 0.8480 - classification_loss: 0.0998 399/500 [======================>.......] - ETA: 25s - loss: 0.9479 - regression_loss: 0.8481 - classification_loss: 0.0999 400/500 [=======================>......] - ETA: 24s - loss: 0.9482 - regression_loss: 0.8483 - classification_loss: 0.0998 401/500 [=======================>......] - ETA: 24s - loss: 0.9490 - regression_loss: 0.8491 - classification_loss: 0.1000 402/500 [=======================>......] - ETA: 24s - loss: 0.9486 - regression_loss: 0.8487 - classification_loss: 0.0999 403/500 [=======================>......] - ETA: 24s - loss: 0.9479 - regression_loss: 0.8481 - classification_loss: 0.0997 404/500 [=======================>......] - ETA: 23s - loss: 0.9480 - regression_loss: 0.8482 - classification_loss: 0.0998 405/500 [=======================>......] - ETA: 23s - loss: 0.9471 - regression_loss: 0.8475 - classification_loss: 0.0996 406/500 [=======================>......] - ETA: 23s - loss: 0.9470 - regression_loss: 0.8474 - classification_loss: 0.0996 407/500 [=======================>......] - ETA: 23s - loss: 0.9467 - regression_loss: 0.8472 - classification_loss: 0.0995 408/500 [=======================>......] - ETA: 22s - loss: 0.9464 - regression_loss: 0.8469 - classification_loss: 0.0994 409/500 [=======================>......] 
- ETA: 22s - loss: 0.9448 - regression_loss: 0.8455 - classification_loss: 0.0992 410/500 [=======================>......] - ETA: 22s - loss: 0.9450 - regression_loss: 0.8457 - classification_loss: 0.0993 411/500 [=======================>......] - ETA: 22s - loss: 0.9442 - regression_loss: 0.8451 - classification_loss: 0.0991 412/500 [=======================>......] - ETA: 21s - loss: 0.9438 - regression_loss: 0.8447 - classification_loss: 0.0991 413/500 [=======================>......] - ETA: 21s - loss: 0.9446 - regression_loss: 0.8454 - classification_loss: 0.0992 414/500 [=======================>......] - ETA: 21s - loss: 0.9459 - regression_loss: 0.8466 - classification_loss: 0.0993 415/500 [=======================>......] - ETA: 21s - loss: 0.9464 - regression_loss: 0.8469 - classification_loss: 0.0994 416/500 [=======================>......] - ETA: 20s - loss: 0.9457 - regression_loss: 0.8464 - classification_loss: 0.0993 417/500 [========================>.....] - ETA: 20s - loss: 0.9449 - regression_loss: 0.8457 - classification_loss: 0.0991 418/500 [========================>.....] - ETA: 20s - loss: 0.9451 - regression_loss: 0.8459 - classification_loss: 0.0992 419/500 [========================>.....] - ETA: 20s - loss: 0.9437 - regression_loss: 0.8447 - classification_loss: 0.0990 420/500 [========================>.....] - ETA: 19s - loss: 0.9446 - regression_loss: 0.8455 - classification_loss: 0.0991 421/500 [========================>.....] - ETA: 19s - loss: 0.9443 - regression_loss: 0.8452 - classification_loss: 0.0991 422/500 [========================>.....] - ETA: 19s - loss: 0.9435 - regression_loss: 0.8445 - classification_loss: 0.0990 423/500 [========================>.....] - ETA: 19s - loss: 0.9441 - regression_loss: 0.8449 - classification_loss: 0.0992 424/500 [========================>.....] - ETA: 18s - loss: 0.9429 - regression_loss: 0.8439 - classification_loss: 0.0990 425/500 [========================>.....] 
- ETA: 18s - loss: 0.9417 - regression_loss: 0.8428 - classification_loss: 0.0988 426/500 [========================>.....] - ETA: 18s - loss: 0.9414 - regression_loss: 0.8427 - classification_loss: 0.0987 427/500 [========================>.....] - ETA: 18s - loss: 0.9423 - regression_loss: 0.8436 - classification_loss: 0.0987 428/500 [========================>.....] - ETA: 17s - loss: 0.9424 - regression_loss: 0.8437 - classification_loss: 0.0987 429/500 [========================>.....] - ETA: 17s - loss: 0.9424 - regression_loss: 0.8437 - classification_loss: 0.0987 430/500 [========================>.....] - ETA: 17s - loss: 0.9420 - regression_loss: 0.8434 - classification_loss: 0.0986 431/500 [========================>.....] - ETA: 17s - loss: 0.9417 - regression_loss: 0.8432 - classification_loss: 0.0986 432/500 [========================>.....] - ETA: 16s - loss: 0.9407 - regression_loss: 0.8423 - classification_loss: 0.0984 433/500 [========================>.....] - ETA: 16s - loss: 0.9414 - regression_loss: 0.8431 - classification_loss: 0.0984 434/500 [=========================>....] - ETA: 16s - loss: 0.9417 - regression_loss: 0.8433 - classification_loss: 0.0984 435/500 [=========================>....] - ETA: 16s - loss: 0.9423 - regression_loss: 0.8437 - classification_loss: 0.0986 436/500 [=========================>....] - ETA: 15s - loss: 0.9416 - regression_loss: 0.8432 - classification_loss: 0.0985 437/500 [=========================>....] - ETA: 15s - loss: 0.9421 - regression_loss: 0.8437 - classification_loss: 0.0985 438/500 [=========================>....] - ETA: 15s - loss: 0.9424 - regression_loss: 0.8439 - classification_loss: 0.0985 439/500 [=========================>....] - ETA: 15s - loss: 0.9430 - regression_loss: 0.8443 - classification_loss: 0.0986 440/500 [=========================>....] - ETA: 14s - loss: 0.9431 - regression_loss: 0.8444 - classification_loss: 0.0986 441/500 [=========================>....] 
- ETA: 14s - loss: 0.9426 - regression_loss: 0.8440 - classification_loss: 0.0986 442/500 [=========================>....] - ETA: 14s - loss: 0.9418 - regression_loss: 0.8433 - classification_loss: 0.0984 443/500 [=========================>....] - ETA: 14s - loss: 0.9418 - regression_loss: 0.8433 - classification_loss: 0.0985 444/500 [=========================>....] - ETA: 13s - loss: 0.9408 - regression_loss: 0.8424 - classification_loss: 0.0984 445/500 [=========================>....] - ETA: 13s - loss: 0.9407 - regression_loss: 0.8424 - classification_loss: 0.0984 446/500 [=========================>....] - ETA: 13s - loss: 0.9408 - regression_loss: 0.8424 - classification_loss: 0.0985 447/500 [=========================>....] - ETA: 13s - loss: 0.9403 - regression_loss: 0.8420 - classification_loss: 0.0984 448/500 [=========================>....] - ETA: 12s - loss: 0.9402 - regression_loss: 0.8419 - classification_loss: 0.0983 449/500 [=========================>....] - ETA: 12s - loss: 0.9409 - regression_loss: 0.8425 - classification_loss: 0.0984 450/500 [==========================>...] - ETA: 12s - loss: 0.9403 - regression_loss: 0.8420 - classification_loss: 0.0983 451/500 [==========================>...] - ETA: 12s - loss: 0.9392 - regression_loss: 0.8411 - classification_loss: 0.0981 452/500 [==========================>...] - ETA: 11s - loss: 0.9395 - regression_loss: 0.8414 - classification_loss: 0.0981 453/500 [==========================>...] - ETA: 11s - loss: 0.9392 - regression_loss: 0.8412 - classification_loss: 0.0980 454/500 [==========================>...] - ETA: 11s - loss: 0.9402 - regression_loss: 0.8419 - classification_loss: 0.0983 455/500 [==========================>...] - ETA: 11s - loss: 0.9411 - regression_loss: 0.8426 - classification_loss: 0.0985 456/500 [==========================>...] - ETA: 10s - loss: 0.9400 - regression_loss: 0.8417 - classification_loss: 0.0983 457/500 [==========================>...] 
- ETA: 10s - loss: 0.9404 - regression_loss: 0.8420 - classification_loss: 0.0984 458/500 [==========================>...] - ETA: 10s - loss: 0.9417 - regression_loss: 0.8431 - classification_loss: 0.0986 459/500 [==========================>...] - ETA: 10s - loss: 0.9407 - regression_loss: 0.8422 - classification_loss: 0.0985 460/500 [==========================>...] - ETA: 9s - loss: 0.9419 - regression_loss: 0.8431 - classification_loss: 0.0987  461/500 [==========================>...] - ETA: 9s - loss: 0.9416 - regression_loss: 0.8429 - classification_loss: 0.0987 462/500 [==========================>...] - ETA: 9s - loss: 0.9424 - regression_loss: 0.8436 - classification_loss: 0.0987 463/500 [==========================>...] - ETA: 9s - loss: 0.9421 - regression_loss: 0.8435 - classification_loss: 0.0986 464/500 [==========================>...] - ETA: 8s - loss: 0.9420 - regression_loss: 0.8433 - classification_loss: 0.0986 465/500 [==========================>...] - ETA: 8s - loss: 0.9417 - regression_loss: 0.8432 - classification_loss: 0.0986 466/500 [==========================>...] - ETA: 8s - loss: 0.9416 - regression_loss: 0.8430 - classification_loss: 0.0986 467/500 [===========================>..] - ETA: 8s - loss: 0.9412 - regression_loss: 0.8427 - classification_loss: 0.0985 468/500 [===========================>..] - ETA: 7s - loss: 0.9412 - regression_loss: 0.8427 - classification_loss: 0.0985 469/500 [===========================>..] - ETA: 7s - loss: 0.9413 - regression_loss: 0.8427 - classification_loss: 0.0986 470/500 [===========================>..] - ETA: 7s - loss: 0.9425 - regression_loss: 0.8437 - classification_loss: 0.0988 471/500 [===========================>..] - ETA: 7s - loss: 0.9439 - regression_loss: 0.8451 - classification_loss: 0.0988 472/500 [===========================>..] - ETA: 6s - loss: 0.9440 - regression_loss: 0.8450 - classification_loss: 0.0989 473/500 [===========================>..] 
- ETA: 6s - loss: 0.9449 - regression_loss: 0.8458 - classification_loss: 0.0991 474/500 [===========================>..] - ETA: 6s - loss: 0.9445 - regression_loss: 0.8455 - classification_loss: 0.0990 475/500 [===========================>..] - ETA: 6s - loss: 0.9444 - regression_loss: 0.8455 - classification_loss: 0.0990 476/500 [===========================>..] - ETA: 5s - loss: 0.9442 - regression_loss: 0.8453 - classification_loss: 0.0989 477/500 [===========================>..] - ETA: 5s - loss: 0.9431 - regression_loss: 0.8444 - classification_loss: 0.0987 478/500 [===========================>..] - ETA: 5s - loss: 0.9431 - regression_loss: 0.8444 - classification_loss: 0.0987 479/500 [===========================>..] - ETA: 5s - loss: 0.9443 - regression_loss: 0.8454 - classification_loss: 0.0989 480/500 [===========================>..] - ETA: 4s - loss: 0.9457 - regression_loss: 0.8466 - classification_loss: 0.0990 481/500 [===========================>..] - ETA: 4s - loss: 0.9445 - regression_loss: 0.8457 - classification_loss: 0.0989 482/500 [===========================>..] - ETA: 4s - loss: 0.9443 - regression_loss: 0.8455 - classification_loss: 0.0988 483/500 [===========================>..] - ETA: 4s - loss: 0.9447 - regression_loss: 0.8459 - classification_loss: 0.0989 484/500 [============================>.] - ETA: 3s - loss: 0.9448 - regression_loss: 0.8460 - classification_loss: 0.0988 485/500 [============================>.] - ETA: 3s - loss: 0.9437 - regression_loss: 0.8450 - classification_loss: 0.0987 486/500 [============================>.] - ETA: 3s - loss: 0.9441 - regression_loss: 0.8454 - classification_loss: 0.0987 487/500 [============================>.] - ETA: 3s - loss: 0.9433 - regression_loss: 0.8448 - classification_loss: 0.0986 488/500 [============================>.] - ETA: 2s - loss: 0.9430 - regression_loss: 0.8445 - classification_loss: 0.0985 489/500 [============================>.] 
- ETA: 2s - loss: 0.9423 - regression_loss: 0.8438 - classification_loss: 0.0985 490/500 [============================>.] - ETA: 2s - loss: 0.9423 - regression_loss: 0.8437 - classification_loss: 0.0985 491/500 [============================>.] - ETA: 2s - loss: 0.9426 - regression_loss: 0.8441 - classification_loss: 0.0986 492/500 [============================>.] - ETA: 1s - loss: 0.9414 - regression_loss: 0.8430 - classification_loss: 0.0984 493/500 [============================>.] - ETA: 1s - loss: 0.9416 - regression_loss: 0.8432 - classification_loss: 0.0984 494/500 [============================>.] - ETA: 1s - loss: 0.9414 - regression_loss: 0.8430 - classification_loss: 0.0984 495/500 [============================>.] - ETA: 1s - loss: 0.9414 - regression_loss: 0.8430 - classification_loss: 0.0984 496/500 [============================>.] - ETA: 0s - loss: 0.9414 - regression_loss: 0.8430 - classification_loss: 0.0984 497/500 [============================>.] - ETA: 0s - loss: 0.9419 - regression_loss: 0.8434 - classification_loss: 0.0985 498/500 [============================>.] - ETA: 0s - loss: 0.9410 - regression_loss: 0.8426 - classification_loss: 0.0984 499/500 [============================>.] - ETA: 0s - loss: 0.9402 - regression_loss: 0.8419 - classification_loss: 0.0983 500/500 [==============================] - 124s 249ms/step - loss: 0.9414 - regression_loss: 0.8429 - classification_loss: 0.0985
1172 instances of class plum with average precision: 0.7889
mAP: 0.7889
Epoch 00045: saving model to ./training/snapshots/resnet50_pascal_45.h5
Epoch 46/150
1/500 [..............................] - ETA: 1:56 - loss: 1.4247 - regression_loss: 1.1661 - classification_loss: 0.2586 2/500 [..............................] - ETA: 1:58 - loss: 0.9995 - regression_loss: 0.8375 - classification_loss: 0.1620 3/500 [..............................] - ETA: 2:01 - loss: 0.8345 - regression_loss: 0.7187 - classification_loss: 0.1157 4/500 [..............................] 
- ETA: 2:02 - loss: 0.8262 - regression_loss: 0.7173 - classification_loss: 0.1088 5/500 [..............................] - ETA: 2:01 - loss: 0.7576 - regression_loss: 0.6613 - classification_loss: 0.0962 6/500 [..............................] - ETA: 2:01 - loss: 0.7220 - regression_loss: 0.6387 - classification_loss: 0.0833 7/500 [..............................] - ETA: 2:01 - loss: 0.7514 - regression_loss: 0.6743 - classification_loss: 0.0771 8/500 [..............................] - ETA: 2:01 - loss: 0.7887 - regression_loss: 0.7051 - classification_loss: 0.0835 9/500 [..............................] - ETA: 2:01 - loss: 0.7920 - regression_loss: 0.7077 - classification_loss: 0.0843 10/500 [..............................] - ETA: 2:01 - loss: 0.7933 - regression_loss: 0.7104 - classification_loss: 0.0829 11/500 [..............................] - ETA: 2:01 - loss: 0.8372 - regression_loss: 0.7493 - classification_loss: 0.0879 12/500 [..............................] - ETA: 2:01 - loss: 0.8644 - regression_loss: 0.7716 - classification_loss: 0.0928 13/500 [..............................] - ETA: 2:01 - loss: 0.8774 - regression_loss: 0.7832 - classification_loss: 0.0941 14/500 [..............................] - ETA: 2:00 - loss: 0.8695 - regression_loss: 0.7748 - classification_loss: 0.0946 15/500 [..............................] - ETA: 2:00 - loss: 0.8770 - regression_loss: 0.7780 - classification_loss: 0.0990 16/500 [..............................] - ETA: 2:00 - loss: 0.8498 - regression_loss: 0.7548 - classification_loss: 0.0951 17/500 [>.............................] - ETA: 2:00 - loss: 0.8290 - regression_loss: 0.7375 - classification_loss: 0.0916 18/500 [>.............................] - ETA: 1:59 - loss: 0.8359 - regression_loss: 0.7444 - classification_loss: 0.0914 19/500 [>.............................] - ETA: 1:59 - loss: 0.8362 - regression_loss: 0.7426 - classification_loss: 0.0936 20/500 [>.............................] 
- ETA: 1:59 - loss: 0.8549 - regression_loss: 0.7586 - classification_loss: 0.0963 21/500 [>.............................] - ETA: 1:59 - loss: 0.8716 - regression_loss: 0.7727 - classification_loss: 0.0988 22/500 [>.............................] - ETA: 1:59 - loss: 0.8766 - regression_loss: 0.7774 - classification_loss: 0.0991 23/500 [>.............................] - ETA: 1:58 - loss: 0.8939 - regression_loss: 0.7925 - classification_loss: 0.1014 24/500 [>.............................] - ETA: 1:58 - loss: 0.8944 - regression_loss: 0.7927 - classification_loss: 0.1017 25/500 [>.............................] - ETA: 1:58 - loss: 0.8765 - regression_loss: 0.7771 - classification_loss: 0.0994 26/500 [>.............................] - ETA: 1:58 - loss: 0.8839 - regression_loss: 0.7836 - classification_loss: 0.1003 27/500 [>.............................] - ETA: 1:58 - loss: 0.8930 - regression_loss: 0.7914 - classification_loss: 0.1016 28/500 [>.............................] - ETA: 1:57 - loss: 0.9061 - regression_loss: 0.8011 - classification_loss: 0.1050 29/500 [>.............................] - ETA: 1:57 - loss: 0.9188 - regression_loss: 0.8115 - classification_loss: 0.1073 30/500 [>.............................] - ETA: 1:56 - loss: 0.9207 - regression_loss: 0.8146 - classification_loss: 0.1062 31/500 [>.............................] - ETA: 1:55 - loss: 0.9037 - regression_loss: 0.8000 - classification_loss: 0.1037 32/500 [>.............................] - ETA: 1:55 - loss: 0.9000 - regression_loss: 0.7977 - classification_loss: 0.1023 33/500 [>.............................] - ETA: 1:55 - loss: 0.8995 - regression_loss: 0.7970 - classification_loss: 0.1024 34/500 [=>............................] - ETA: 1:54 - loss: 0.9028 - regression_loss: 0.8006 - classification_loss: 0.1021 35/500 [=>............................] - ETA: 1:54 - loss: 0.9313 - regression_loss: 0.8234 - classification_loss: 0.1079 36/500 [=>............................] 
- ETA: 1:54 - loss: 0.9373 - regression_loss: 0.8292 - classification_loss: 0.1081 37/500 [=>............................] - ETA: 1:54 - loss: 0.9388 - regression_loss: 0.8309 - classification_loss: 0.1079 38/500 [=>............................] - ETA: 1:54 - loss: 0.9459 - regression_loss: 0.8371 - classification_loss: 0.1089 39/500 [=>............................] - ETA: 1:54 - loss: 0.9440 - regression_loss: 0.8343 - classification_loss: 0.1098 40/500 [=>............................] - ETA: 1:53 - loss: 0.9312 - regression_loss: 0.8238 - classification_loss: 0.1075 41/500 [=>............................] - ETA: 1:53 - loss: 0.9224 - regression_loss: 0.8162 - classification_loss: 0.1062 42/500 [=>............................] - ETA: 1:53 - loss: 0.9182 - regression_loss: 0.8138 - classification_loss: 0.1044 43/500 [=>............................] - ETA: 1:53 - loss: 0.9359 - regression_loss: 0.8319 - classification_loss: 0.1040 44/500 [=>............................] - ETA: 1:53 - loss: 0.9361 - regression_loss: 0.8320 - classification_loss: 0.1041 45/500 [=>............................] - ETA: 1:52 - loss: 0.9389 - regression_loss: 0.8344 - classification_loss: 0.1045 46/500 [=>............................] - ETA: 1:52 - loss: 0.9295 - regression_loss: 0.8267 - classification_loss: 0.1028 47/500 [=>............................] - ETA: 1:52 - loss: 0.9260 - regression_loss: 0.8248 - classification_loss: 0.1012 48/500 [=>............................] - ETA: 1:52 - loss: 0.9324 - regression_loss: 0.8306 - classification_loss: 0.1019 49/500 [=>............................] - ETA: 1:52 - loss: 0.9389 - regression_loss: 0.8370 - classification_loss: 0.1018 50/500 [==>...........................] - ETA: 1:51 - loss: 0.9424 - regression_loss: 0.8406 - classification_loss: 0.1019 51/500 [==>...........................] - ETA: 1:51 - loss: 0.9447 - regression_loss: 0.8423 - classification_loss: 0.1024 52/500 [==>...........................] 
- ETA: 1:51 - loss: 0.9461 - regression_loss: 0.8435 - classification_loss: 0.1025 53/500 [==>...........................] - ETA: 1:51 - loss: 0.9426 - regression_loss: 0.8410 - classification_loss: 0.1015 54/500 [==>...........................] - ETA: 1:50 - loss: 0.9488 - regression_loss: 0.8470 - classification_loss: 0.1018 55/500 [==>...........................] - ETA: 1:50 - loss: 0.9524 - regression_loss: 0.8506 - classification_loss: 0.1018 56/500 [==>...........................] - ETA: 1:50 - loss: 0.9477 - regression_loss: 0.8470 - classification_loss: 0.1007 57/500 [==>...........................] - ETA: 1:50 - loss: 0.9346 - regression_loss: 0.8354 - classification_loss: 0.0992 58/500 [==>...........................] - ETA: 1:49 - loss: 0.9322 - regression_loss: 0.8334 - classification_loss: 0.0987 59/500 [==>...........................] - ETA: 1:49 - loss: 0.9241 - regression_loss: 0.8267 - classification_loss: 0.0975 60/500 [==>...........................] - ETA: 1:49 - loss: 0.9353 - regression_loss: 0.8367 - classification_loss: 0.0986 61/500 [==>...........................] - ETA: 1:49 - loss: 0.9328 - regression_loss: 0.8345 - classification_loss: 0.0982 62/500 [==>...........................] - ETA: 1:49 - loss: 0.9328 - regression_loss: 0.8355 - classification_loss: 0.0973 63/500 [==>...........................] - ETA: 1:48 - loss: 0.9377 - regression_loss: 0.8408 - classification_loss: 0.0969 64/500 [==>...........................] - ETA: 1:48 - loss: 0.9387 - regression_loss: 0.8426 - classification_loss: 0.0962 65/500 [==>...........................] - ETA: 1:48 - loss: 0.9358 - regression_loss: 0.8401 - classification_loss: 0.0956 66/500 [==>...........................] - ETA: 1:48 - loss: 0.9292 - regression_loss: 0.8344 - classification_loss: 0.0948 67/500 [===>..........................] - ETA: 1:48 - loss: 0.9229 - regression_loss: 0.8292 - classification_loss: 0.0937 68/500 [===>..........................] 
- ETA: 1:47 - loss: 0.9177 - regression_loss: 0.8247 - classification_loss: 0.0930 69/500 [===>..........................] - ETA: 1:47 - loss: 0.9259 - regression_loss: 0.8314 - classification_loss: 0.0944 70/500 [===>..........................] - ETA: 1:47 - loss: 0.9329 - regression_loss: 0.8377 - classification_loss: 0.0952 71/500 [===>..........................] - ETA: 1:47 - loss: 0.9359 - regression_loss: 0.8400 - classification_loss: 0.0959 72/500 [===>..........................] - ETA: 1:46 - loss: 0.9362 - regression_loss: 0.8404 - classification_loss: 0.0958 73/500 [===>..........................] - ETA: 1:46 - loss: 0.9358 - regression_loss: 0.8401 - classification_loss: 0.0957 74/500 [===>..........................] - ETA: 1:46 - loss: 0.9387 - regression_loss: 0.8425 - classification_loss: 0.0963 75/500 [===>..........................] - ETA: 1:46 - loss: 0.9407 - regression_loss: 0.8439 - classification_loss: 0.0968 76/500 [===>..........................] - ETA: 1:46 - loss: 0.9430 - regression_loss: 0.8452 - classification_loss: 0.0979 77/500 [===>..........................] - ETA: 1:45 - loss: 0.9439 - regression_loss: 0.8455 - classification_loss: 0.0984 78/500 [===>..........................] - ETA: 1:45 - loss: 0.9490 - regression_loss: 0.8500 - classification_loss: 0.0990 79/500 [===>..........................] - ETA: 1:45 - loss: 0.9462 - regression_loss: 0.8474 - classification_loss: 0.0987 80/500 [===>..........................] - ETA: 1:45 - loss: 0.9492 - regression_loss: 0.8495 - classification_loss: 0.0997 81/500 [===>..........................] - ETA: 1:44 - loss: 0.9488 - regression_loss: 0.8490 - classification_loss: 0.0999 82/500 [===>..........................] - ETA: 1:44 - loss: 0.9465 - regression_loss: 0.8468 - classification_loss: 0.0997 83/500 [===>..........................] - ETA: 1:44 - loss: 0.9479 - regression_loss: 0.8477 - classification_loss: 0.1002 84/500 [====>.........................] 
- ETA: 1:44 - loss: 0.9506 - regression_loss: 0.8498 - classification_loss: 0.1009 85/500 [====>.........................] - ETA: 1:43 - loss: 0.9576 - regression_loss: 0.8550 - classification_loss: 0.1026 86/500 [====>.........................] - ETA: 1:43 - loss: 0.9517 - regression_loss: 0.8498 - classification_loss: 0.1019 87/500 [====>.........................] - ETA: 1:43 - loss: 0.9532 - regression_loss: 0.8514 - classification_loss: 0.1017 88/500 [====>.........................] - ETA: 1:43 - loss: 0.9536 - regression_loss: 0.8519 - classification_loss: 0.1018 89/500 [====>.........................] - ETA: 1:42 - loss: 0.9508 - regression_loss: 0.8498 - classification_loss: 0.1009 90/500 [====>.........................] - ETA: 1:42 - loss: 0.9467 - regression_loss: 0.8467 - classification_loss: 0.1001 91/500 [====>.........................] - ETA: 1:42 - loss: 0.9413 - regression_loss: 0.8420 - classification_loss: 0.0993 92/500 [====>.........................] - ETA: 1:42 - loss: 0.9361 - regression_loss: 0.8375 - classification_loss: 0.0986 93/500 [====>.........................] - ETA: 1:41 - loss: 0.9404 - regression_loss: 0.8409 - classification_loss: 0.0994 94/500 [====>.........................] - ETA: 1:41 - loss: 0.9349 - regression_loss: 0.8362 - classification_loss: 0.0987 95/500 [====>.........................] - ETA: 1:41 - loss: 0.9424 - regression_loss: 0.8425 - classification_loss: 0.0999 96/500 [====>.........................] - ETA: 1:41 - loss: 0.9395 - regression_loss: 0.8400 - classification_loss: 0.0995 97/500 [====>.........................] - ETA: 1:40 - loss: 0.9366 - regression_loss: 0.8376 - classification_loss: 0.0990 98/500 [====>.........................] - ETA: 1:40 - loss: 0.9376 - regression_loss: 0.8385 - classification_loss: 0.0992 99/500 [====>.........................] - ETA: 1:40 - loss: 0.9420 - regression_loss: 0.8422 - classification_loss: 0.0998 100/500 [=====>........................] 
- ETA: 1:40 - loss: 0.9449 - regression_loss: 0.8441 - classification_loss: 0.1007 101/500 [=====>........................] - ETA: 1:39 - loss: 0.9396 - regression_loss: 0.8392 - classification_loss: 0.1005 102/500 [=====>........................] - ETA: 1:39 - loss: 0.9395 - regression_loss: 0.8391 - classification_loss: 0.1004 103/500 [=====>........................] - ETA: 1:39 - loss: 0.9367 - regression_loss: 0.8367 - classification_loss: 0.1000 104/500 [=====>........................] - ETA: 1:39 - loss: 0.9391 - regression_loss: 0.8386 - classification_loss: 0.1005 105/500 [=====>........................] - ETA: 1:38 - loss: 0.9346 - regression_loss: 0.8347 - classification_loss: 0.0999 106/500 [=====>........................] - ETA: 1:38 - loss: 0.9345 - regression_loss: 0.8349 - classification_loss: 0.0996 107/500 [=====>........................] - ETA: 1:38 - loss: 0.9387 - regression_loss: 0.8384 - classification_loss: 0.1003 108/500 [=====>........................] - ETA: 1:38 - loss: 0.9376 - regression_loss: 0.8375 - classification_loss: 0.1000 109/500 [=====>........................] - ETA: 1:37 - loss: 0.9348 - regression_loss: 0.8354 - classification_loss: 0.0994 110/500 [=====>........................] - ETA: 1:37 - loss: 0.9336 - regression_loss: 0.8344 - classification_loss: 0.0992 111/500 [=====>........................] - ETA: 1:37 - loss: 0.9332 - regression_loss: 0.8339 - classification_loss: 0.0992 112/500 [=====>........................] - ETA: 1:37 - loss: 0.9288 - regression_loss: 0.8302 - classification_loss: 0.0986 113/500 [=====>........................] - ETA: 1:36 - loss: 0.9312 - regression_loss: 0.8319 - classification_loss: 0.0993 114/500 [=====>........................] - ETA: 1:36 - loss: 0.9374 - regression_loss: 0.8377 - classification_loss: 0.0997 115/500 [=====>........................] - ETA: 1:36 - loss: 0.9400 - regression_loss: 0.8399 - classification_loss: 0.1001 116/500 [=====>........................] 
- ETA: 1:36 - loss: 0.9341 - regression_loss: 0.8347 - classification_loss: 0.0993 117/500 [======>.......................] - ETA: 1:35 - loss: 0.9358 - regression_loss: 0.8363 - classification_loss: 0.0995 118/500 [======>.......................] - ETA: 1:35 - loss: 0.9307 - regression_loss: 0.8319 - classification_loss: 0.0988 119/500 [======>.......................] - ETA: 1:35 - loss: 0.9309 - regression_loss: 0.8323 - classification_loss: 0.0986 120/500 [======>.......................] - ETA: 1:35 - loss: 0.9329 - regression_loss: 0.8338 - classification_loss: 0.0992 121/500 [======>.......................] - ETA: 1:34 - loss: 0.9372 - regression_loss: 0.8373 - classification_loss: 0.0999 122/500 [======>.......................] - ETA: 1:34 - loss: 0.9356 - regression_loss: 0.8361 - classification_loss: 0.0995 123/500 [======>.......................] - ETA: 1:34 - loss: 0.9361 - regression_loss: 0.8371 - classification_loss: 0.0990 124/500 [======>.......................] - ETA: 1:34 - loss: 0.9364 - regression_loss: 0.8373 - classification_loss: 0.0991 125/500 [======>.......................] - ETA: 1:33 - loss: 0.9377 - regression_loss: 0.8385 - classification_loss: 0.0992 126/500 [======>.......................] - ETA: 1:33 - loss: 0.9454 - regression_loss: 0.8452 - classification_loss: 0.1002 127/500 [======>.......................] - ETA: 1:33 - loss: 0.9409 - regression_loss: 0.8413 - classification_loss: 0.0996 128/500 [======>.......................] - ETA: 1:33 - loss: 0.9386 - regression_loss: 0.8394 - classification_loss: 0.0992 129/500 [======>.......................] - ETA: 1:32 - loss: 0.9361 - regression_loss: 0.8373 - classification_loss: 0.0989 130/500 [======>.......................] - ETA: 1:32 - loss: 0.9377 - regression_loss: 0.8387 - classification_loss: 0.0991 131/500 [======>.......................] - ETA: 1:32 - loss: 0.9345 - regression_loss: 0.8360 - classification_loss: 0.0985 132/500 [======>.......................] 
- ETA: 1:31 - loss: 0.9299 - regression_loss: 0.8318 - classification_loss: 0.0981 133/500 [======>.......................] - ETA: 1:31 - loss: 0.9294 - regression_loss: 0.8316 - classification_loss: 0.0978 134/500 [=======>......................] - ETA: 1:31 - loss: 0.9300 - regression_loss: 0.8322 - classification_loss: 0.0978 135/500 [=======>......................] - ETA: 1:31 - loss: 0.9323 - regression_loss: 0.8336 - classification_loss: 0.0987 136/500 [=======>......................] - ETA: 1:30 - loss: 0.9300 - regression_loss: 0.8316 - classification_loss: 0.0984 137/500 [=======>......................] - ETA: 1:30 - loss: 0.9339 - regression_loss: 0.8353 - classification_loss: 0.0987 138/500 [=======>......................] - ETA: 1:30 - loss: 0.9349 - regression_loss: 0.8358 - classification_loss: 0.0992 139/500 [=======>......................] - ETA: 1:30 - loss: 0.9341 - regression_loss: 0.8352 - classification_loss: 0.0989 140/500 [=======>......................] - ETA: 1:30 - loss: 0.9309 - regression_loss: 0.8326 - classification_loss: 0.0983 141/500 [=======>......................] - ETA: 1:29 - loss: 0.9306 - regression_loss: 0.8325 - classification_loss: 0.0981 142/500 [=======>......................] - ETA: 1:29 - loss: 0.9288 - regression_loss: 0.8310 - classification_loss: 0.0978 143/500 [=======>......................] - ETA: 1:29 - loss: 0.9292 - regression_loss: 0.8314 - classification_loss: 0.0978 144/500 [=======>......................] - ETA: 1:29 - loss: 0.9291 - regression_loss: 0.8313 - classification_loss: 0.0978 145/500 [=======>......................] - ETA: 1:28 - loss: 0.9322 - regression_loss: 0.8338 - classification_loss: 0.0984 146/500 [=======>......................] - ETA: 1:28 - loss: 0.9371 - regression_loss: 0.8376 - classification_loss: 0.0995 147/500 [=======>......................] - ETA: 1:28 - loss: 0.9371 - regression_loss: 0.8377 - classification_loss: 0.0995 148/500 [=======>......................] 
- ETA: 1:28 - loss: 0.9337 - regression_loss: 0.8346 - classification_loss: 0.0991 149/500 [=======>......................] - ETA: 1:27 - loss: 0.9342 - regression_loss: 0.8351 - classification_loss: 0.0991 150/500 [========>.....................] - ETA: 1:27 - loss: 0.9364 - regression_loss: 0.8374 - classification_loss: 0.0990 151/500 [========>.....................] - ETA: 1:27 - loss: 0.9381 - regression_loss: 0.8388 - classification_loss: 0.0993 152/500 [========>.....................] - ETA: 1:27 - loss: 0.9401 - regression_loss: 0.8405 - classification_loss: 0.0996 153/500 [========>.....................] - ETA: 1:26 - loss: 0.9403 - regression_loss: 0.8406 - classification_loss: 0.0997 154/500 [========>.....................] - ETA: 1:26 - loss: 0.9364 - regression_loss: 0.8371 - classification_loss: 0.0993 155/500 [========>.....................] - ETA: 1:26 - loss: 0.9379 - regression_loss: 0.8384 - classification_loss: 0.0994 156/500 [========>.....................] - ETA: 1:26 - loss: 0.9398 - regression_loss: 0.8401 - classification_loss: 0.0997 157/500 [========>.....................] - ETA: 1:25 - loss: 0.9389 - regression_loss: 0.8393 - classification_loss: 0.0997 158/500 [========>.....................] - ETA: 1:25 - loss: 0.9413 - regression_loss: 0.8413 - classification_loss: 0.1000 159/500 [========>.....................] - ETA: 1:25 - loss: 0.9429 - regression_loss: 0.8426 - classification_loss: 0.1003 160/500 [========>.....................] - ETA: 1:25 - loss: 0.9409 - regression_loss: 0.8411 - classification_loss: 0.0998 161/500 [========>.....................] - ETA: 1:24 - loss: 0.9414 - regression_loss: 0.8415 - classification_loss: 0.0999 162/500 [========>.....................] - ETA: 1:24 - loss: 0.9386 - regression_loss: 0.8392 - classification_loss: 0.0994 163/500 [========>.....................] - ETA: 1:24 - loss: 0.9386 - regression_loss: 0.8393 - classification_loss: 0.0993 164/500 [========>.....................] 
- ETA: 1:24 - loss: 0.9388 - regression_loss: 0.8395 - classification_loss: 0.0993 165/500 [========>.....................] - ETA: 1:23 - loss: 0.9408 - regression_loss: 0.8410 - classification_loss: 0.0998 166/500 [========>.....................] - ETA: 1:23 - loss: 0.9406 - regression_loss: 0.8406 - classification_loss: 0.1000 167/500 [=========>....................] - ETA: 1:23 - loss: 0.9424 - regression_loss: 0.8424 - classification_loss: 0.1000 168/500 [=========>....................] - ETA: 1:23 - loss: 0.9463 - regression_loss: 0.8455 - classification_loss: 0.1007 169/500 [=========>....................] - ETA: 1:22 - loss: 0.9486 - regression_loss: 0.8475 - classification_loss: 0.1011 170/500 [=========>....................] - ETA: 1:22 - loss: 0.9497 - regression_loss: 0.8486 - classification_loss: 0.1011 171/500 [=========>....................] - ETA: 1:22 - loss: 0.9515 - regression_loss: 0.8502 - classification_loss: 0.1013 172/500 [=========>....................] - ETA: 1:22 - loss: 0.9511 - regression_loss: 0.8498 - classification_loss: 0.1013 173/500 [=========>....................] - ETA: 1:21 - loss: 0.9490 - regression_loss: 0.8482 - classification_loss: 0.1007 174/500 [=========>....................] - ETA: 1:21 - loss: 0.9497 - regression_loss: 0.8487 - classification_loss: 0.1009 175/500 [=========>....................] - ETA: 1:21 - loss: 0.9470 - regression_loss: 0.8466 - classification_loss: 0.1004 176/500 [=========>....................] - ETA: 1:21 - loss: 0.9481 - regression_loss: 0.8474 - classification_loss: 0.1007 177/500 [=========>....................] - ETA: 1:20 - loss: 0.9454 - regression_loss: 0.8451 - classification_loss: 0.1003 178/500 [=========>....................] - ETA: 1:20 - loss: 0.9478 - regression_loss: 0.8472 - classification_loss: 0.1005 179/500 [=========>....................] - ETA: 1:20 - loss: 0.9477 - regression_loss: 0.8471 - classification_loss: 0.1005 180/500 [=========>....................] 
- ETA: 1:20 - loss: 0.9469 - regression_loss: 0.8466 - classification_loss: 0.1003 181/500 [=========>....................] - ETA: 1:19 - loss: 0.9462 - regression_loss: 0.8462 - classification_loss: 0.1000 182/500 [=========>....................] - ETA: 1:19 - loss: 0.9449 - regression_loss: 0.8453 - classification_loss: 0.0996 183/500 [=========>....................] - ETA: 1:19 - loss: 0.9464 - regression_loss: 0.8466 - classification_loss: 0.0998 184/500 [==========>...................] - ETA: 1:19 - loss: 0.9456 - regression_loss: 0.8459 - classification_loss: 0.0996 185/500 [==========>...................] - ETA: 1:18 - loss: 0.9437 - regression_loss: 0.8444 - classification_loss: 0.0992 186/500 [==========>...................] - ETA: 1:18 - loss: 0.9404 - regression_loss: 0.8416 - classification_loss: 0.0988 187/500 [==========>...................] - ETA: 1:18 - loss: 0.9412 - regression_loss: 0.8421 - classification_loss: 0.0991 188/500 [==========>...................] - ETA: 1:18 - loss: 0.9413 - regression_loss: 0.8421 - classification_loss: 0.0992 189/500 [==========>...................] - ETA: 1:17 - loss: 0.9413 - regression_loss: 0.8419 - classification_loss: 0.0994 190/500 [==========>...................] - ETA: 1:17 - loss: 0.9430 - regression_loss: 0.8434 - classification_loss: 0.0995 191/500 [==========>...................] - ETA: 1:17 - loss: 0.9438 - regression_loss: 0.8442 - classification_loss: 0.0996 192/500 [==========>...................] - ETA: 1:17 - loss: 0.9425 - regression_loss: 0.8431 - classification_loss: 0.0994 193/500 [==========>...................] - ETA: 1:16 - loss: 0.9471 - regression_loss: 0.8468 - classification_loss: 0.1002 194/500 [==========>...................] - ETA: 1:16 - loss: 0.9471 - regression_loss: 0.8471 - classification_loss: 0.1001 195/500 [==========>...................] - ETA: 1:16 - loss: 0.9475 - regression_loss: 0.8473 - classification_loss: 0.1001 196/500 [==========>...................] 
- ETA: 1:16 - loss: 0.9471 - regression_loss: 0.8473 - classification_loss: 0.0998 197/500 [==========>...................] - ETA: 1:15 - loss: 0.9486 - regression_loss: 0.8484 - classification_loss: 0.1001 198/500 [==========>...................] - ETA: 1:15 - loss: 0.9488 - regression_loss: 0.8486 - classification_loss: 0.1002 199/500 [==========>...................] - ETA: 1:15 - loss: 0.9466 - regression_loss: 0.8468 - classification_loss: 0.0998 200/500 [===========>..................] - ETA: 1:15 - loss: 0.9468 - regression_loss: 0.8471 - classification_loss: 0.0998 201/500 [===========>..................] - ETA: 1:14 - loss: 0.9455 - regression_loss: 0.8459 - classification_loss: 0.0996 202/500 [===========>..................] - ETA: 1:14 - loss: 0.9433 - regression_loss: 0.8441 - classification_loss: 0.0992 203/500 [===========>..................] - ETA: 1:14 - loss: 0.9429 - regression_loss: 0.8438 - classification_loss: 0.0991 204/500 [===========>..................] - ETA: 1:14 - loss: 0.9442 - regression_loss: 0.8451 - classification_loss: 0.0992 205/500 [===========>..................] - ETA: 1:13 - loss: 0.9417 - regression_loss: 0.8429 - classification_loss: 0.0988 206/500 [===========>..................] - ETA: 1:13 - loss: 0.9397 - regression_loss: 0.8411 - classification_loss: 0.0986 207/500 [===========>..................] - ETA: 1:13 - loss: 0.9400 - regression_loss: 0.8415 - classification_loss: 0.0985 208/500 [===========>..................] - ETA: 1:12 - loss: 0.9384 - regression_loss: 0.8401 - classification_loss: 0.0983 209/500 [===========>..................] - ETA: 1:12 - loss: 0.9395 - regression_loss: 0.8410 - classification_loss: 0.0984 210/500 [===========>..................] - ETA: 1:12 - loss: 0.9399 - regression_loss: 0.8414 - classification_loss: 0.0985 211/500 [===========>..................] - ETA: 1:12 - loss: 0.9413 - regression_loss: 0.8426 - classification_loss: 0.0986 212/500 [===========>..................] 
- ETA: 1:12 - loss: 0.9440 - regression_loss: 0.8454 - classification_loss: 0.0986 213/500 [===========>..................] - ETA: 1:11 - loss: 0.9429 - regression_loss: 0.8444 - classification_loss: 0.0985 214/500 [===========>..................] - ETA: 1:11 - loss: 0.9435 - regression_loss: 0.8451 - classification_loss: 0.0984 215/500 [===========>..................] - ETA: 1:11 - loss: 0.9437 - regression_loss: 0.8451 - classification_loss: 0.0986 216/500 [===========>..................] - ETA: 1:11 - loss: 0.9486 - regression_loss: 0.8459 - classification_loss: 0.1027 217/500 [============>.................] - ETA: 1:10 - loss: 0.9484 - regression_loss: 0.8459 - classification_loss: 0.1025 218/500 [============>.................] - ETA: 1:10 - loss: 0.9470 - regression_loss: 0.8449 - classification_loss: 0.1022 219/500 [============>.................] - ETA: 1:10 - loss: 0.9482 - regression_loss: 0.8458 - classification_loss: 0.1024 220/500 [============>.................] - ETA: 1:10 - loss: 0.9469 - regression_loss: 0.8446 - classification_loss: 0.1023 221/500 [============>.................] - ETA: 1:09 - loss: 0.9445 - regression_loss: 0.8426 - classification_loss: 0.1019 222/500 [============>.................] - ETA: 1:09 - loss: 0.9441 - regression_loss: 0.8422 - classification_loss: 0.1019 223/500 [============>.................] - ETA: 1:09 - loss: 0.9419 - regression_loss: 0.8405 - classification_loss: 0.1015 224/500 [============>.................] - ETA: 1:09 - loss: 0.9426 - regression_loss: 0.8410 - classification_loss: 0.1016 225/500 [============>.................] - ETA: 1:08 - loss: 0.9444 - regression_loss: 0.8424 - classification_loss: 0.1020 226/500 [============>.................] - ETA: 1:08 - loss: 0.9456 - regression_loss: 0.8435 - classification_loss: 0.1021 227/500 [============>.................] - ETA: 1:08 - loss: 0.9454 - regression_loss: 0.8433 - classification_loss: 0.1021 228/500 [============>.................] 
[... per-batch progress updates for steps 229-499 of epoch 46 omitted; running loss fluctuated between ~0.94 and ~0.96 (regression_loss ~0.84-0.86, classification_loss ~0.10) ...]
500/500 [==============================] - 125s 251ms/step - loss: 0.9420 - regression_loss: 0.8416 - classification_loss: 0.1004
1172 instances of class plum with average precision: 0.8054
mAP: 0.8054
Epoch 00046: saving model to ./training/snapshots/resnet50_pascal_46.h5
Epoch 47/150
[... per-batch progress updates for steps 1-62 of epoch 47 omitted; loss started at 1.4497 on the first batch and settled to ~0.93-0.98 ...]
- ETA: 1:48 - loss: 0.9696 - regression_loss: 0.8608 - classification_loss: 0.1088 63/500 [==>...........................] - ETA: 1:48 - loss: 0.9643 - regression_loss: 0.8562 - classification_loss: 0.1081 64/500 [==>...........................] - ETA: 1:48 - loss: 0.9585 - regression_loss: 0.8513 - classification_loss: 0.1071 65/500 [==>...........................] - ETA: 1:48 - loss: 0.9486 - regression_loss: 0.8428 - classification_loss: 0.1058 66/500 [==>...........................] - ETA: 1:47 - loss: 0.9512 - regression_loss: 0.8451 - classification_loss: 0.1061 67/500 [===>..........................] - ETA: 1:47 - loss: 0.9583 - regression_loss: 0.8508 - classification_loss: 0.1075 68/500 [===>..........................] - ETA: 1:47 - loss: 0.9491 - regression_loss: 0.8426 - classification_loss: 0.1065 69/500 [===>..........................] - ETA: 1:47 - loss: 0.9535 - regression_loss: 0.8469 - classification_loss: 0.1066 70/500 [===>..........................] - ETA: 1:46 - loss: 0.9526 - regression_loss: 0.8459 - classification_loss: 0.1066 71/500 [===>..........................] - ETA: 1:46 - loss: 0.9468 - regression_loss: 0.8409 - classification_loss: 0.1059 72/500 [===>..........................] - ETA: 1:46 - loss: 0.9447 - regression_loss: 0.8389 - classification_loss: 0.1058 73/500 [===>..........................] - ETA: 1:46 - loss: 0.9468 - regression_loss: 0.8412 - classification_loss: 0.1056 74/500 [===>..........................] - ETA: 1:45 - loss: 0.9464 - regression_loss: 0.8409 - classification_loss: 0.1055 75/500 [===>..........................] - ETA: 1:45 - loss: 0.9442 - regression_loss: 0.8391 - classification_loss: 0.1050 76/500 [===>..........................] - ETA: 1:45 - loss: 0.9363 - regression_loss: 0.8322 - classification_loss: 0.1041 77/500 [===>..........................] - ETA: 1:45 - loss: 0.9358 - regression_loss: 0.8311 - classification_loss: 0.1048 78/500 [===>..........................] 
- ETA: 1:44 - loss: 0.9337 - regression_loss: 0.8284 - classification_loss: 0.1053 79/500 [===>..........................] - ETA: 1:44 - loss: 0.9370 - regression_loss: 0.8314 - classification_loss: 0.1055 80/500 [===>..........................] - ETA: 1:44 - loss: 0.9395 - regression_loss: 0.8337 - classification_loss: 0.1058 81/500 [===>..........................] - ETA: 1:44 - loss: 0.9357 - regression_loss: 0.8308 - classification_loss: 0.1050 82/500 [===>..........................] - ETA: 1:43 - loss: 0.9344 - regression_loss: 0.8296 - classification_loss: 0.1048 83/500 [===>..........................] - ETA: 1:43 - loss: 0.9357 - regression_loss: 0.8305 - classification_loss: 0.1052 84/500 [====>.........................] - ETA: 1:43 - loss: 0.9391 - regression_loss: 0.8332 - classification_loss: 0.1059 85/500 [====>.........................] - ETA: 1:43 - loss: 0.9321 - regression_loss: 0.8273 - classification_loss: 0.1048 86/500 [====>.........................] - ETA: 1:42 - loss: 0.9293 - regression_loss: 0.8254 - classification_loss: 0.1039 87/500 [====>.........................] - ETA: 1:42 - loss: 0.9254 - regression_loss: 0.8215 - classification_loss: 0.1039 88/500 [====>.........................] - ETA: 1:42 - loss: 0.9274 - regression_loss: 0.8233 - classification_loss: 0.1042 89/500 [====>.........................] - ETA: 1:42 - loss: 0.9274 - regression_loss: 0.8234 - classification_loss: 0.1040 90/500 [====>.........................] - ETA: 1:41 - loss: 0.9268 - regression_loss: 0.8229 - classification_loss: 0.1039 91/500 [====>.........................] - ETA: 1:41 - loss: 0.9264 - regression_loss: 0.8226 - classification_loss: 0.1039 92/500 [====>.........................] - ETA: 1:41 - loss: 0.9276 - regression_loss: 0.8239 - classification_loss: 0.1037 93/500 [====>.........................] - ETA: 1:41 - loss: 0.9274 - regression_loss: 0.8236 - classification_loss: 0.1039 94/500 [====>.........................] 
- ETA: 1:40 - loss: 0.9309 - regression_loss: 0.8268 - classification_loss: 0.1041 95/500 [====>.........................] - ETA: 1:40 - loss: 0.9286 - regression_loss: 0.8246 - classification_loss: 0.1040 96/500 [====>.........................] - ETA: 1:40 - loss: 0.9225 - regression_loss: 0.8195 - classification_loss: 0.1031 97/500 [====>.........................] - ETA: 1:40 - loss: 0.9178 - regression_loss: 0.8155 - classification_loss: 0.1023 98/500 [====>.........................] - ETA: 1:40 - loss: 0.9162 - regression_loss: 0.8141 - classification_loss: 0.1022 99/500 [====>.........................] - ETA: 1:39 - loss: 0.9177 - regression_loss: 0.8153 - classification_loss: 0.1024 100/500 [=====>........................] - ETA: 1:39 - loss: 0.9186 - regression_loss: 0.8164 - classification_loss: 0.1022 101/500 [=====>........................] - ETA: 1:39 - loss: 0.9216 - regression_loss: 0.8189 - classification_loss: 0.1027 102/500 [=====>........................] - ETA: 1:39 - loss: 0.9234 - regression_loss: 0.8208 - classification_loss: 0.1026 103/500 [=====>........................] - ETA: 1:38 - loss: 0.9219 - regression_loss: 0.8200 - classification_loss: 0.1019 104/500 [=====>........................] - ETA: 1:38 - loss: 0.9217 - regression_loss: 0.8195 - classification_loss: 0.1021 105/500 [=====>........................] - ETA: 1:38 - loss: 0.9222 - regression_loss: 0.8197 - classification_loss: 0.1025 106/500 [=====>........................] - ETA: 1:38 - loss: 0.9232 - regression_loss: 0.8210 - classification_loss: 0.1022 107/500 [=====>........................] - ETA: 1:37 - loss: 0.9199 - regression_loss: 0.8182 - classification_loss: 0.1017 108/500 [=====>........................] - ETA: 1:37 - loss: 0.9207 - regression_loss: 0.8192 - classification_loss: 0.1015 109/500 [=====>........................] - ETA: 1:37 - loss: 0.9213 - regression_loss: 0.8193 - classification_loss: 0.1020 110/500 [=====>........................] 
- ETA: 1:37 - loss: 0.9213 - regression_loss: 0.8196 - classification_loss: 0.1017 111/500 [=====>........................] - ETA: 1:37 - loss: 0.9263 - regression_loss: 0.8241 - classification_loss: 0.1023 112/500 [=====>........................] - ETA: 1:36 - loss: 0.9313 - regression_loss: 0.8279 - classification_loss: 0.1034 113/500 [=====>........................] - ETA: 1:36 - loss: 0.9312 - regression_loss: 0.8282 - classification_loss: 0.1031 114/500 [=====>........................] - ETA: 1:36 - loss: 0.9333 - regression_loss: 0.8302 - classification_loss: 0.1032 115/500 [=====>........................] - ETA: 1:36 - loss: 0.9344 - regression_loss: 0.8310 - classification_loss: 0.1034 116/500 [=====>........................] - ETA: 1:35 - loss: 0.9344 - regression_loss: 0.8310 - classification_loss: 0.1034 117/500 [======>.......................] - ETA: 1:35 - loss: 0.9345 - regression_loss: 0.8313 - classification_loss: 0.1032 118/500 [======>.......................] - ETA: 1:35 - loss: 0.9358 - regression_loss: 0.8324 - classification_loss: 0.1034 119/500 [======>.......................] - ETA: 1:35 - loss: 0.9319 - regression_loss: 0.8290 - classification_loss: 0.1029 120/500 [======>.......................] - ETA: 1:34 - loss: 0.9273 - regression_loss: 0.8251 - classification_loss: 0.1023 121/500 [======>.......................] - ETA: 1:34 - loss: 0.9269 - regression_loss: 0.8250 - classification_loss: 0.1020 122/500 [======>.......................] - ETA: 1:34 - loss: 0.9270 - regression_loss: 0.8252 - classification_loss: 0.1018 123/500 [======>.......................] - ETA: 1:34 - loss: 0.9284 - regression_loss: 0.8265 - classification_loss: 0.1019 124/500 [======>.......................] - ETA: 1:33 - loss: 0.9284 - regression_loss: 0.8266 - classification_loss: 0.1018 125/500 [======>.......................] - ETA: 1:33 - loss: 0.9294 - regression_loss: 0.8276 - classification_loss: 0.1018 126/500 [======>.......................] 
- ETA: 1:33 - loss: 0.9266 - regression_loss: 0.8253 - classification_loss: 0.1013 127/500 [======>.......................] - ETA: 1:33 - loss: 0.9232 - regression_loss: 0.8225 - classification_loss: 0.1007 128/500 [======>.......................] - ETA: 1:32 - loss: 0.9286 - regression_loss: 0.8279 - classification_loss: 0.1007 129/500 [======>.......................] - ETA: 1:32 - loss: 0.9245 - regression_loss: 0.8244 - classification_loss: 0.1001 130/500 [======>.......................] - ETA: 1:32 - loss: 0.9254 - regression_loss: 0.8251 - classification_loss: 0.1003 131/500 [======>.......................] - ETA: 1:32 - loss: 0.9229 - regression_loss: 0.8231 - classification_loss: 0.0998 132/500 [======>.......................] - ETA: 1:31 - loss: 0.9224 - regression_loss: 0.8225 - classification_loss: 0.0999 133/500 [======>.......................] - ETA: 1:31 - loss: 0.9247 - regression_loss: 0.8242 - classification_loss: 0.1005 134/500 [=======>......................] - ETA: 1:31 - loss: 0.9266 - regression_loss: 0.8258 - classification_loss: 0.1008 135/500 [=======>......................] - ETA: 1:30 - loss: 0.9276 - regression_loss: 0.8264 - classification_loss: 0.1011 136/500 [=======>......................] - ETA: 1:30 - loss: 0.9295 - regression_loss: 0.8280 - classification_loss: 0.1015 137/500 [=======>......................] - ETA: 1:30 - loss: 0.9289 - regression_loss: 0.8274 - classification_loss: 0.1015 138/500 [=======>......................] - ETA: 1:30 - loss: 0.9277 - regression_loss: 0.8265 - classification_loss: 0.1013 139/500 [=======>......................] - ETA: 1:29 - loss: 0.9225 - regression_loss: 0.8219 - classification_loss: 0.1006 140/500 [=======>......................] - ETA: 1:29 - loss: 0.9241 - regression_loss: 0.8234 - classification_loss: 0.1008 141/500 [=======>......................] - ETA: 1:29 - loss: 0.9239 - regression_loss: 0.8232 - classification_loss: 0.1007 142/500 [=======>......................] 
- ETA: 1:28 - loss: 0.9259 - regression_loss: 0.8249 - classification_loss: 0.1010 143/500 [=======>......................] - ETA: 1:28 - loss: 0.9269 - regression_loss: 0.8258 - classification_loss: 0.1011 144/500 [=======>......................] - ETA: 1:28 - loss: 0.9232 - regression_loss: 0.8227 - classification_loss: 0.1005 145/500 [=======>......................] - ETA: 1:28 - loss: 0.9176 - regression_loss: 0.8176 - classification_loss: 0.1000 146/500 [=======>......................] - ETA: 1:27 - loss: 0.9165 - regression_loss: 0.8167 - classification_loss: 0.0998 147/500 [=======>......................] - ETA: 1:27 - loss: 0.9169 - regression_loss: 0.8171 - classification_loss: 0.0998 148/500 [=======>......................] - ETA: 1:27 - loss: 0.9164 - regression_loss: 0.8171 - classification_loss: 0.0994 149/500 [=======>......................] - ETA: 1:27 - loss: 0.9119 - regression_loss: 0.8130 - classification_loss: 0.0988 150/500 [========>.....................] - ETA: 1:26 - loss: 0.9133 - regression_loss: 0.8142 - classification_loss: 0.0991 151/500 [========>.....................] - ETA: 1:26 - loss: 0.9129 - regression_loss: 0.8141 - classification_loss: 0.0988 152/500 [========>.....................] - ETA: 1:26 - loss: 0.9133 - regression_loss: 0.8145 - classification_loss: 0.0988 153/500 [========>.....................] - ETA: 1:26 - loss: 0.9135 - regression_loss: 0.8146 - classification_loss: 0.0989 154/500 [========>.....................] - ETA: 1:25 - loss: 0.9138 - regression_loss: 0.8150 - classification_loss: 0.0988 155/500 [========>.....................] - ETA: 1:25 - loss: 0.9170 - regression_loss: 0.8175 - classification_loss: 0.0995 156/500 [========>.....................] - ETA: 1:25 - loss: 0.9185 - regression_loss: 0.8193 - classification_loss: 0.0992 157/500 [========>.....................] - ETA: 1:25 - loss: 0.9205 - regression_loss: 0.8209 - classification_loss: 0.0996 158/500 [========>.....................] 
- ETA: 1:24 - loss: 0.9221 - regression_loss: 0.8220 - classification_loss: 0.1001 159/500 [========>.....................] - ETA: 1:24 - loss: 0.9202 - regression_loss: 0.8206 - classification_loss: 0.0996 160/500 [========>.....................] - ETA: 1:24 - loss: 0.9222 - regression_loss: 0.8222 - classification_loss: 0.1000 161/500 [========>.....................] - ETA: 1:24 - loss: 0.9187 - regression_loss: 0.8193 - classification_loss: 0.0994 162/500 [========>.....................] - ETA: 1:23 - loss: 0.9193 - regression_loss: 0.8199 - classification_loss: 0.0994 163/500 [========>.....................] - ETA: 1:23 - loss: 0.9195 - regression_loss: 0.8199 - classification_loss: 0.0997 164/500 [========>.....................] - ETA: 1:23 - loss: 0.9172 - regression_loss: 0.8181 - classification_loss: 0.0992 165/500 [========>.....................] - ETA: 1:23 - loss: 0.9162 - regression_loss: 0.8170 - classification_loss: 0.0992 166/500 [========>.....................] - ETA: 1:22 - loss: 0.9195 - regression_loss: 0.8198 - classification_loss: 0.0997 167/500 [=========>....................] - ETA: 1:22 - loss: 0.9195 - regression_loss: 0.8197 - classification_loss: 0.0998 168/500 [=========>....................] - ETA: 1:22 - loss: 0.9199 - regression_loss: 0.8200 - classification_loss: 0.0999 169/500 [=========>....................] - ETA: 1:22 - loss: 0.9201 - regression_loss: 0.8202 - classification_loss: 0.0998 170/500 [=========>....................] - ETA: 1:21 - loss: 0.9165 - regression_loss: 0.8171 - classification_loss: 0.0994 171/500 [=========>....................] - ETA: 1:21 - loss: 0.9196 - regression_loss: 0.8199 - classification_loss: 0.0997 172/500 [=========>....................] - ETA: 1:21 - loss: 0.9197 - regression_loss: 0.8202 - classification_loss: 0.0995 173/500 [=========>....................] - ETA: 1:21 - loss: 0.9222 - regression_loss: 0.8223 - classification_loss: 0.0999 174/500 [=========>....................] 
- ETA: 1:20 - loss: 0.9246 - regression_loss: 0.8246 - classification_loss: 0.1000 175/500 [=========>....................] - ETA: 1:20 - loss: 0.9245 - regression_loss: 0.8246 - classification_loss: 0.0999 176/500 [=========>....................] - ETA: 1:20 - loss: 0.9253 - regression_loss: 0.8255 - classification_loss: 0.0997 177/500 [=========>....................] - ETA: 1:20 - loss: 0.9224 - regression_loss: 0.8230 - classification_loss: 0.0994 178/500 [=========>....................] - ETA: 1:19 - loss: 0.9238 - regression_loss: 0.8241 - classification_loss: 0.0996 179/500 [=========>....................] - ETA: 1:19 - loss: 0.9241 - regression_loss: 0.8246 - classification_loss: 0.0995 180/500 [=========>....................] - ETA: 1:19 - loss: 0.9241 - regression_loss: 0.8249 - classification_loss: 0.0992 181/500 [=========>....................] - ETA: 1:19 - loss: 0.9211 - regression_loss: 0.8223 - classification_loss: 0.0988 182/500 [=========>....................] - ETA: 1:18 - loss: 0.9218 - regression_loss: 0.8229 - classification_loss: 0.0988 183/500 [=========>....................] - ETA: 1:18 - loss: 0.9224 - regression_loss: 0.8235 - classification_loss: 0.0989 184/500 [==========>...................] - ETA: 1:18 - loss: 0.9236 - regression_loss: 0.8247 - classification_loss: 0.0989 185/500 [==========>...................] - ETA: 1:18 - loss: 0.9227 - regression_loss: 0.8239 - classification_loss: 0.0987 186/500 [==========>...................] - ETA: 1:17 - loss: 0.9222 - regression_loss: 0.8236 - classification_loss: 0.0986 187/500 [==========>...................] - ETA: 1:17 - loss: 0.9197 - regression_loss: 0.8215 - classification_loss: 0.0982 188/500 [==========>...................] - ETA: 1:17 - loss: 0.9184 - regression_loss: 0.8202 - classification_loss: 0.0982 189/500 [==========>...................] - ETA: 1:17 - loss: 0.9179 - regression_loss: 0.8200 - classification_loss: 0.0979 190/500 [==========>...................] 
- ETA: 1:16 - loss: 0.9198 - regression_loss: 0.8215 - classification_loss: 0.0983 191/500 [==========>...................] - ETA: 1:16 - loss: 0.9184 - regression_loss: 0.8205 - classification_loss: 0.0979 192/500 [==========>...................] - ETA: 1:16 - loss: 0.9155 - regression_loss: 0.8180 - classification_loss: 0.0976 193/500 [==========>...................] - ETA: 1:16 - loss: 0.9153 - regression_loss: 0.8178 - classification_loss: 0.0975 194/500 [==========>...................] - ETA: 1:15 - loss: 0.9132 - regression_loss: 0.8161 - classification_loss: 0.0971 195/500 [==========>...................] - ETA: 1:15 - loss: 0.9111 - regression_loss: 0.8143 - classification_loss: 0.0967 196/500 [==========>...................] - ETA: 1:15 - loss: 0.9113 - regression_loss: 0.8143 - classification_loss: 0.0970 197/500 [==========>...................] - ETA: 1:15 - loss: 0.9143 - regression_loss: 0.8168 - classification_loss: 0.0975 198/500 [==========>...................] - ETA: 1:14 - loss: 0.9140 - regression_loss: 0.8167 - classification_loss: 0.0974 199/500 [==========>...................] - ETA: 1:14 - loss: 0.9131 - regression_loss: 0.8160 - classification_loss: 0.0971 200/500 [===========>..................] - ETA: 1:14 - loss: 0.9122 - regression_loss: 0.8153 - classification_loss: 0.0969 201/500 [===========>..................] - ETA: 1:14 - loss: 0.9130 - regression_loss: 0.8159 - classification_loss: 0.0971 202/500 [===========>..................] - ETA: 1:13 - loss: 0.9155 - regression_loss: 0.8179 - classification_loss: 0.0976 203/500 [===========>..................] - ETA: 1:13 - loss: 0.9161 - regression_loss: 0.8183 - classification_loss: 0.0978 204/500 [===========>..................] - ETA: 1:13 - loss: 0.9164 - regression_loss: 0.8186 - classification_loss: 0.0978 205/500 [===========>..................] - ETA: 1:13 - loss: 0.9175 - regression_loss: 0.8196 - classification_loss: 0.0979 206/500 [===========>..................] 
- ETA: 1:12 - loss: 0.9168 - regression_loss: 0.8190 - classification_loss: 0.0978 207/500 [===========>..................] - ETA: 1:12 - loss: 0.9198 - regression_loss: 0.8216 - classification_loss: 0.0982 208/500 [===========>..................] - ETA: 1:12 - loss: 0.9214 - regression_loss: 0.8230 - classification_loss: 0.0985 209/500 [===========>..................] - ETA: 1:12 - loss: 0.9225 - regression_loss: 0.8239 - classification_loss: 0.0986 210/500 [===========>..................] - ETA: 1:12 - loss: 0.9236 - regression_loss: 0.8250 - classification_loss: 0.0986 211/500 [===========>..................] - ETA: 1:11 - loss: 0.9249 - regression_loss: 0.8260 - classification_loss: 0.0988 212/500 [===========>..................] - ETA: 1:11 - loss: 0.9259 - regression_loss: 0.8269 - classification_loss: 0.0990 213/500 [===========>..................] - ETA: 1:11 - loss: 0.9265 - regression_loss: 0.8273 - classification_loss: 0.0992 214/500 [===========>..................] - ETA: 1:11 - loss: 0.9289 - regression_loss: 0.8296 - classification_loss: 0.0993 215/500 [===========>..................] - ETA: 1:10 - loss: 0.9293 - regression_loss: 0.8297 - classification_loss: 0.0995 216/500 [===========>..................] - ETA: 1:10 - loss: 0.9277 - regression_loss: 0.8285 - classification_loss: 0.0992 217/500 [============>.................] - ETA: 1:10 - loss: 0.9268 - regression_loss: 0.8276 - classification_loss: 0.0992 218/500 [============>.................] - ETA: 1:10 - loss: 0.9247 - regression_loss: 0.8258 - classification_loss: 0.0989 219/500 [============>.................] - ETA: 1:09 - loss: 0.9244 - regression_loss: 0.8256 - classification_loss: 0.0988 220/500 [============>.................] - ETA: 1:09 - loss: 0.9258 - regression_loss: 0.8267 - classification_loss: 0.0991 221/500 [============>.................] - ETA: 1:09 - loss: 0.9241 - regression_loss: 0.8253 - classification_loss: 0.0988 222/500 [============>.................] 
- ETA: 1:09 - loss: 0.9238 - regression_loss: 0.8251 - classification_loss: 0.0987 223/500 [============>.................] - ETA: 1:08 - loss: 0.9239 - regression_loss: 0.8252 - classification_loss: 0.0987 224/500 [============>.................] - ETA: 1:08 - loss: 0.9241 - regression_loss: 0.8253 - classification_loss: 0.0988 225/500 [============>.................] - ETA: 1:08 - loss: 0.9214 - regression_loss: 0.8229 - classification_loss: 0.0984 226/500 [============>.................] - ETA: 1:08 - loss: 0.9218 - regression_loss: 0.8233 - classification_loss: 0.0985 227/500 [============>.................] - ETA: 1:07 - loss: 0.9229 - regression_loss: 0.8242 - classification_loss: 0.0987 228/500 [============>.................] - ETA: 1:07 - loss: 0.9202 - regression_loss: 0.8218 - classification_loss: 0.0984 229/500 [============>.................] - ETA: 1:07 - loss: 0.9189 - regression_loss: 0.8208 - classification_loss: 0.0981 230/500 [============>.................] - ETA: 1:07 - loss: 0.9202 - regression_loss: 0.8219 - classification_loss: 0.0983 231/500 [============>.................] - ETA: 1:06 - loss: 0.9197 - regression_loss: 0.8215 - classification_loss: 0.0982 232/500 [============>.................] - ETA: 1:06 - loss: 0.9203 - regression_loss: 0.8221 - classification_loss: 0.0982 233/500 [============>.................] - ETA: 1:06 - loss: 0.9178 - regression_loss: 0.8199 - classification_loss: 0.0979 234/500 [=============>................] - ETA: 1:06 - loss: 0.9176 - regression_loss: 0.8197 - classification_loss: 0.0979 235/500 [=============>................] - ETA: 1:05 - loss: 0.9173 - regression_loss: 0.8194 - classification_loss: 0.0979 236/500 [=============>................] - ETA: 1:05 - loss: 0.9174 - regression_loss: 0.8195 - classification_loss: 0.0980 237/500 [=============>................] - ETA: 1:05 - loss: 0.9180 - regression_loss: 0.8199 - classification_loss: 0.0981 238/500 [=============>................] 
- ETA: 1:04 - loss: 0.9165 - regression_loss: 0.8188 - classification_loss: 0.0978 239/500 [=============>................] - ETA: 1:04 - loss: 0.9171 - regression_loss: 0.8194 - classification_loss: 0.0978 240/500 [=============>................] - ETA: 1:04 - loss: 0.9181 - regression_loss: 0.8202 - classification_loss: 0.0979 241/500 [=============>................] - ETA: 1:04 - loss: 0.9187 - regression_loss: 0.8208 - classification_loss: 0.0979 242/500 [=============>................] - ETA: 1:03 - loss: 0.9192 - regression_loss: 0.8211 - classification_loss: 0.0980 243/500 [=============>................] - ETA: 1:03 - loss: 0.9176 - regression_loss: 0.8198 - classification_loss: 0.0978 244/500 [=============>................] - ETA: 1:03 - loss: 0.9200 - regression_loss: 0.8220 - classification_loss: 0.0981 245/500 [=============>................] - ETA: 1:03 - loss: 0.9201 - regression_loss: 0.8220 - classification_loss: 0.0981 246/500 [=============>................] - ETA: 1:02 - loss: 0.9222 - regression_loss: 0.8239 - classification_loss: 0.0983 247/500 [=============>................] - ETA: 1:02 - loss: 0.9221 - regression_loss: 0.8239 - classification_loss: 0.0982 248/500 [=============>................] - ETA: 1:02 - loss: 0.9199 - regression_loss: 0.8219 - classification_loss: 0.0980 249/500 [=============>................] - ETA: 1:02 - loss: 0.9202 - regression_loss: 0.8223 - classification_loss: 0.0979 250/500 [==============>...............] - ETA: 1:01 - loss: 0.9199 - regression_loss: 0.8222 - classification_loss: 0.0976 251/500 [==============>...............] - ETA: 1:01 - loss: 0.9207 - regression_loss: 0.8229 - classification_loss: 0.0978 252/500 [==============>...............] - ETA: 1:01 - loss: 0.9215 - regression_loss: 0.8235 - classification_loss: 0.0979 253/500 [==============>...............] - ETA: 1:01 - loss: 0.9216 - regression_loss: 0.8237 - classification_loss: 0.0979 254/500 [==============>...............] 
- ETA: 1:01 - loss: 0.9204 - regression_loss: 0.8228 - classification_loss: 0.0976 255/500 [==============>...............] - ETA: 1:00 - loss: 0.9218 - regression_loss: 0.8239 - classification_loss: 0.0979 256/500 [==============>...............] - ETA: 1:00 - loss: 0.9216 - regression_loss: 0.8238 - classification_loss: 0.0978 257/500 [==============>...............] - ETA: 1:00 - loss: 0.9223 - regression_loss: 0.8243 - classification_loss: 0.0980 258/500 [==============>...............] - ETA: 1:00 - loss: 0.9242 - regression_loss: 0.8258 - classification_loss: 0.0984 259/500 [==============>...............] - ETA: 59s - loss: 0.9235 - regression_loss: 0.8253 - classification_loss: 0.0982  260/500 [==============>...............] - ETA: 59s - loss: 0.9245 - regression_loss: 0.8262 - classification_loss: 0.0983 261/500 [==============>...............] - ETA: 59s - loss: 0.9238 - regression_loss: 0.8254 - classification_loss: 0.0984 262/500 [==============>...............] - ETA: 59s - loss: 0.9248 - regression_loss: 0.8263 - classification_loss: 0.0986 263/500 [==============>...............] - ETA: 58s - loss: 0.9233 - regression_loss: 0.8250 - classification_loss: 0.0984 264/500 [==============>...............] - ETA: 58s - loss: 0.9234 - regression_loss: 0.8250 - classification_loss: 0.0984 265/500 [==============>...............] - ETA: 58s - loss: 0.9236 - regression_loss: 0.8251 - classification_loss: 0.0984 266/500 [==============>...............] - ETA: 58s - loss: 0.9244 - regression_loss: 0.8259 - classification_loss: 0.0985 267/500 [===============>..............] - ETA: 57s - loss: 0.9237 - regression_loss: 0.8254 - classification_loss: 0.0984 268/500 [===============>..............] - ETA: 57s - loss: 0.9212 - regression_loss: 0.8232 - classification_loss: 0.0981 269/500 [===============>..............] - ETA: 57s - loss: 0.9219 - regression_loss: 0.8238 - classification_loss: 0.0981 270/500 [===============>..............] 
[Epoch 47: per-batch progress-bar updates (batches 271–499) omitted; final result below]
500/500 [==============================] - 124s 249ms/step - loss: 0.9125 - regression_loss: 0.8167 - classification_loss: 0.0958
1172 instances of class plum with average precision: 0.8067
mAP: 0.8067
Epoch 00047: saving model to ./training/snapshots/resnet50_pascal_47.h5
Epoch 48/150
[Epoch 48: per-batch progress-bar updates (batches 1–105) omitted]
- ETA: 1:38 - loss: 0.8983 - regression_loss: 0.8068 - classification_loss: 0.0916 106/500 [=====>........................] - ETA: 1:38 - loss: 0.8962 - regression_loss: 0.8051 - classification_loss: 0.0911 107/500 [=====>........................] - ETA: 1:37 - loss: 0.9000 - regression_loss: 0.8080 - classification_loss: 0.0920 108/500 [=====>........................] - ETA: 1:37 - loss: 0.9052 - regression_loss: 0.8128 - classification_loss: 0.0924 109/500 [=====>........................] - ETA: 1:37 - loss: 0.9056 - regression_loss: 0.8130 - classification_loss: 0.0926 110/500 [=====>........................] - ETA: 1:37 - loss: 0.9004 - regression_loss: 0.8085 - classification_loss: 0.0919 111/500 [=====>........................] - ETA: 1:36 - loss: 0.9056 - regression_loss: 0.8128 - classification_loss: 0.0927 112/500 [=====>........................] - ETA: 1:36 - loss: 0.9070 - regression_loss: 0.8140 - classification_loss: 0.0931 113/500 [=====>........................] - ETA: 1:36 - loss: 0.9167 - regression_loss: 0.8219 - classification_loss: 0.0949 114/500 [=====>........................] - ETA: 1:36 - loss: 0.9182 - regression_loss: 0.8232 - classification_loss: 0.0950 115/500 [=====>........................] - ETA: 1:35 - loss: 0.9211 - regression_loss: 0.8256 - classification_loss: 0.0955 116/500 [=====>........................] - ETA: 1:35 - loss: 0.9228 - regression_loss: 0.8272 - classification_loss: 0.0956 117/500 [======>.......................] - ETA: 1:35 - loss: 0.9246 - regression_loss: 0.8287 - classification_loss: 0.0959 118/500 [======>.......................] - ETA: 1:35 - loss: 0.9230 - regression_loss: 0.8274 - classification_loss: 0.0956 119/500 [======>.......................] - ETA: 1:35 - loss: 0.9182 - regression_loss: 0.8231 - classification_loss: 0.0951 120/500 [======>.......................] - ETA: 1:34 - loss: 0.9200 - regression_loss: 0.8248 - classification_loss: 0.0952 121/500 [======>.......................] 
- ETA: 1:34 - loss: 0.9235 - regression_loss: 0.8279 - classification_loss: 0.0956 122/500 [======>.......................] - ETA: 1:34 - loss: 0.9229 - regression_loss: 0.8273 - classification_loss: 0.0956 123/500 [======>.......................] - ETA: 1:34 - loss: 0.9246 - regression_loss: 0.8289 - classification_loss: 0.0957 124/500 [======>.......................] - ETA: 1:33 - loss: 0.9274 - regression_loss: 0.8313 - classification_loss: 0.0960 125/500 [======>.......................] - ETA: 1:33 - loss: 0.9301 - regression_loss: 0.8339 - classification_loss: 0.0962 126/500 [======>.......................] - ETA: 1:33 - loss: 0.9321 - regression_loss: 0.8355 - classification_loss: 0.0966 127/500 [======>.......................] - ETA: 1:33 - loss: 0.9346 - regression_loss: 0.8379 - classification_loss: 0.0968 128/500 [======>.......................] - ETA: 1:32 - loss: 0.9369 - regression_loss: 0.8398 - classification_loss: 0.0971 129/500 [======>.......................] - ETA: 1:32 - loss: 0.9367 - regression_loss: 0.8395 - classification_loss: 0.0973 130/500 [======>.......................] - ETA: 1:32 - loss: 0.9363 - regression_loss: 0.8392 - classification_loss: 0.0971 131/500 [======>.......................] - ETA: 1:32 - loss: 0.9319 - regression_loss: 0.8353 - classification_loss: 0.0966 132/500 [======>.......................] - ETA: 1:31 - loss: 0.9285 - regression_loss: 0.8324 - classification_loss: 0.0961 133/500 [======>.......................] - ETA: 1:31 - loss: 0.9289 - regression_loss: 0.8325 - classification_loss: 0.0965 134/500 [=======>......................] - ETA: 1:31 - loss: 0.9308 - regression_loss: 0.8342 - classification_loss: 0.0966 135/500 [=======>......................] - ETA: 1:31 - loss: 0.9310 - regression_loss: 0.8337 - classification_loss: 0.0973 136/500 [=======>......................] - ETA: 1:30 - loss: 0.9282 - regression_loss: 0.8315 - classification_loss: 0.0967 137/500 [=======>......................] 
- ETA: 1:30 - loss: 0.9313 - regression_loss: 0.8340 - classification_loss: 0.0972 138/500 [=======>......................] - ETA: 1:30 - loss: 0.9313 - regression_loss: 0.8343 - classification_loss: 0.0970 139/500 [=======>......................] - ETA: 1:30 - loss: 0.9289 - regression_loss: 0.8323 - classification_loss: 0.0966 140/500 [=======>......................] - ETA: 1:30 - loss: 0.9273 - regression_loss: 0.8310 - classification_loss: 0.0963 141/500 [=======>......................] - ETA: 1:29 - loss: 0.9305 - regression_loss: 0.8335 - classification_loss: 0.0970 142/500 [=======>......................] - ETA: 1:29 - loss: 0.9297 - regression_loss: 0.8330 - classification_loss: 0.0968 143/500 [=======>......................] - ETA: 1:29 - loss: 0.9278 - regression_loss: 0.8313 - classification_loss: 0.0965 144/500 [=======>......................] - ETA: 1:28 - loss: 0.9290 - regression_loss: 0.8319 - classification_loss: 0.0971 145/500 [=======>......................] - ETA: 1:28 - loss: 0.9286 - regression_loss: 0.8318 - classification_loss: 0.0967 146/500 [=======>......................] - ETA: 1:28 - loss: 0.9246 - regression_loss: 0.8285 - classification_loss: 0.0961 147/500 [=======>......................] - ETA: 1:28 - loss: 0.9251 - regression_loss: 0.8292 - classification_loss: 0.0959 148/500 [=======>......................] - ETA: 1:27 - loss: 0.9238 - regression_loss: 0.8282 - classification_loss: 0.0956 149/500 [=======>......................] - ETA: 1:27 - loss: 0.9222 - regression_loss: 0.8270 - classification_loss: 0.0952 150/500 [========>.....................] - ETA: 1:27 - loss: 0.9210 - regression_loss: 0.8261 - classification_loss: 0.0949 151/500 [========>.....................] - ETA: 1:27 - loss: 0.9233 - regression_loss: 0.8278 - classification_loss: 0.0955 152/500 [========>.....................] - ETA: 1:26 - loss: 0.9220 - regression_loss: 0.8268 - classification_loss: 0.0952 153/500 [========>.....................] 
- ETA: 1:26 - loss: 0.9229 - regression_loss: 0.8274 - classification_loss: 0.0954 154/500 [========>.....................] - ETA: 1:26 - loss: 0.9209 - regression_loss: 0.8256 - classification_loss: 0.0953 155/500 [========>.....................] - ETA: 1:26 - loss: 0.9209 - regression_loss: 0.8258 - classification_loss: 0.0951 156/500 [========>.....................] - ETA: 1:25 - loss: 0.9216 - regression_loss: 0.8260 - classification_loss: 0.0956 157/500 [========>.....................] - ETA: 1:25 - loss: 0.9212 - regression_loss: 0.8259 - classification_loss: 0.0953 158/500 [========>.....................] - ETA: 1:25 - loss: 0.9233 - regression_loss: 0.8275 - classification_loss: 0.0957 159/500 [========>.....................] - ETA: 1:25 - loss: 0.9225 - regression_loss: 0.8270 - classification_loss: 0.0956 160/500 [========>.....................] - ETA: 1:24 - loss: 0.9191 - regression_loss: 0.8240 - classification_loss: 0.0951 161/500 [========>.....................] - ETA: 1:24 - loss: 0.9186 - regression_loss: 0.8234 - classification_loss: 0.0951 162/500 [========>.....................] - ETA: 1:24 - loss: 0.9193 - regression_loss: 0.8240 - classification_loss: 0.0953 163/500 [========>.....................] - ETA: 1:24 - loss: 0.9159 - regression_loss: 0.8210 - classification_loss: 0.0948 164/500 [========>.....................] - ETA: 1:23 - loss: 0.9165 - regression_loss: 0.8217 - classification_loss: 0.0948 165/500 [========>.....................] - ETA: 1:23 - loss: 0.9155 - regression_loss: 0.8206 - classification_loss: 0.0949 166/500 [========>.....................] - ETA: 1:23 - loss: 0.9127 - regression_loss: 0.8182 - classification_loss: 0.0946 167/500 [=========>....................] - ETA: 1:23 - loss: 0.9112 - regression_loss: 0.8169 - classification_loss: 0.0943 168/500 [=========>....................] - ETA: 1:22 - loss: 0.9094 - regression_loss: 0.8154 - classification_loss: 0.0940 169/500 [=========>....................] 
- ETA: 1:22 - loss: 0.9081 - regression_loss: 0.8143 - classification_loss: 0.0938 170/500 [=========>....................] - ETA: 1:22 - loss: 0.9063 - regression_loss: 0.8129 - classification_loss: 0.0934 171/500 [=========>....................] - ETA: 1:22 - loss: 0.9052 - regression_loss: 0.8120 - classification_loss: 0.0932 172/500 [=========>....................] - ETA: 1:21 - loss: 0.9065 - regression_loss: 0.8131 - classification_loss: 0.0934 173/500 [=========>....................] - ETA: 1:21 - loss: 0.9036 - regression_loss: 0.8106 - classification_loss: 0.0930 174/500 [=========>....................] - ETA: 1:21 - loss: 0.9051 - regression_loss: 0.8120 - classification_loss: 0.0931 175/500 [=========>....................] - ETA: 1:21 - loss: 0.9061 - regression_loss: 0.8132 - classification_loss: 0.0929 176/500 [=========>....................] - ETA: 1:20 - loss: 0.9090 - regression_loss: 0.8156 - classification_loss: 0.0933 177/500 [=========>....................] - ETA: 1:20 - loss: 0.9110 - regression_loss: 0.8170 - classification_loss: 0.0940 178/500 [=========>....................] - ETA: 1:20 - loss: 0.9105 - regression_loss: 0.8166 - classification_loss: 0.0939 179/500 [=========>....................] - ETA: 1:20 - loss: 0.9106 - regression_loss: 0.8166 - classification_loss: 0.0940 180/500 [=========>....................] - ETA: 1:19 - loss: 0.9117 - regression_loss: 0.8176 - classification_loss: 0.0941 181/500 [=========>....................] - ETA: 1:19 - loss: 0.9130 - regression_loss: 0.8185 - classification_loss: 0.0945 182/500 [=========>....................] - ETA: 1:19 - loss: 0.9135 - regression_loss: 0.8189 - classification_loss: 0.0946 183/500 [=========>....................] - ETA: 1:19 - loss: 0.9121 - regression_loss: 0.8176 - classification_loss: 0.0944 184/500 [==========>...................] - ETA: 1:19 - loss: 0.9137 - regression_loss: 0.8191 - classification_loss: 0.0946 185/500 [==========>...................] 
- ETA: 1:18 - loss: 0.9155 - regression_loss: 0.8206 - classification_loss: 0.0949 186/500 [==========>...................] - ETA: 1:18 - loss: 0.9154 - regression_loss: 0.8207 - classification_loss: 0.0947 187/500 [==========>...................] - ETA: 1:18 - loss: 0.9184 - regression_loss: 0.8233 - classification_loss: 0.0952 188/500 [==========>...................] - ETA: 1:17 - loss: 0.9189 - regression_loss: 0.8237 - classification_loss: 0.0952 189/500 [==========>...................] - ETA: 1:17 - loss: 0.9177 - regression_loss: 0.8228 - classification_loss: 0.0949 190/500 [==========>...................] - ETA: 1:17 - loss: 0.9174 - regression_loss: 0.8226 - classification_loss: 0.0948 191/500 [==========>...................] - ETA: 1:17 - loss: 0.9167 - regression_loss: 0.8222 - classification_loss: 0.0945 192/500 [==========>...................] - ETA: 1:16 - loss: 0.9173 - regression_loss: 0.8227 - classification_loss: 0.0946 193/500 [==========>...................] - ETA: 1:16 - loss: 0.9155 - regression_loss: 0.8213 - classification_loss: 0.0943 194/500 [==========>...................] - ETA: 1:16 - loss: 0.9141 - regression_loss: 0.8201 - classification_loss: 0.0940 195/500 [==========>...................] - ETA: 1:16 - loss: 0.9135 - regression_loss: 0.8197 - classification_loss: 0.0938 196/500 [==========>...................] - ETA: 1:16 - loss: 0.9160 - regression_loss: 0.8218 - classification_loss: 0.0942 197/500 [==========>...................] - ETA: 1:15 - loss: 0.9143 - regression_loss: 0.8205 - classification_loss: 0.0938 198/500 [==========>...................] - ETA: 1:15 - loss: 0.9152 - regression_loss: 0.8210 - classification_loss: 0.0941 199/500 [==========>...................] - ETA: 1:15 - loss: 0.9155 - regression_loss: 0.8214 - classification_loss: 0.0941 200/500 [===========>..................] - ETA: 1:15 - loss: 0.9160 - regression_loss: 0.8219 - classification_loss: 0.0941 201/500 [===========>..................] 
- ETA: 1:14 - loss: 0.9149 - regression_loss: 0.8208 - classification_loss: 0.0941 202/500 [===========>..................] - ETA: 1:14 - loss: 0.9151 - regression_loss: 0.8209 - classification_loss: 0.0942 203/500 [===========>..................] - ETA: 1:14 - loss: 0.9162 - regression_loss: 0.8220 - classification_loss: 0.0942 204/500 [===========>..................] - ETA: 1:13 - loss: 0.9170 - regression_loss: 0.8226 - classification_loss: 0.0943 205/500 [===========>..................] - ETA: 1:13 - loss: 0.9175 - regression_loss: 0.8230 - classification_loss: 0.0945 206/500 [===========>..................] - ETA: 1:13 - loss: 0.9161 - regression_loss: 0.8219 - classification_loss: 0.0942 207/500 [===========>..................] - ETA: 1:13 - loss: 0.9165 - regression_loss: 0.8225 - classification_loss: 0.0940 208/500 [===========>..................] - ETA: 1:12 - loss: 0.9159 - regression_loss: 0.8223 - classification_loss: 0.0937 209/500 [===========>..................] - ETA: 1:12 - loss: 0.9160 - regression_loss: 0.8224 - classification_loss: 0.0936 210/500 [===========>..................] - ETA: 1:12 - loss: 0.9143 - regression_loss: 0.8210 - classification_loss: 0.0933 211/500 [===========>..................] - ETA: 1:12 - loss: 0.9166 - regression_loss: 0.8228 - classification_loss: 0.0938 212/500 [===========>..................] - ETA: 1:11 - loss: 0.9186 - regression_loss: 0.8242 - classification_loss: 0.0944 213/500 [===========>..................] - ETA: 1:11 - loss: 0.9191 - regression_loss: 0.8248 - classification_loss: 0.0943 214/500 [===========>..................] - ETA: 1:11 - loss: 0.9182 - regression_loss: 0.8242 - classification_loss: 0.0940 215/500 [===========>..................] - ETA: 1:11 - loss: 0.9166 - regression_loss: 0.8230 - classification_loss: 0.0937 216/500 [===========>..................] - ETA: 1:10 - loss: 0.9192 - regression_loss: 0.8250 - classification_loss: 0.0942 217/500 [============>.................] 
- ETA: 1:10 - loss: 0.9185 - regression_loss: 0.8246 - classification_loss: 0.0939 218/500 [============>.................] - ETA: 1:10 - loss: 0.9195 - regression_loss: 0.8254 - classification_loss: 0.0942 219/500 [============>.................] - ETA: 1:10 - loss: 0.9177 - regression_loss: 0.8238 - classification_loss: 0.0939 220/500 [============>.................] - ETA: 1:09 - loss: 0.9175 - regression_loss: 0.8235 - classification_loss: 0.0940 221/500 [============>.................] - ETA: 1:09 - loss: 0.9183 - regression_loss: 0.8243 - classification_loss: 0.0940 222/500 [============>.................] - ETA: 1:09 - loss: 0.9159 - regression_loss: 0.8222 - classification_loss: 0.0937 223/500 [============>.................] - ETA: 1:09 - loss: 0.9135 - regression_loss: 0.8201 - classification_loss: 0.0934 224/500 [============>.................] - ETA: 1:09 - loss: 0.9133 - regression_loss: 0.8199 - classification_loss: 0.0935 225/500 [============>.................] - ETA: 1:08 - loss: 0.9119 - regression_loss: 0.8188 - classification_loss: 0.0931 226/500 [============>.................] - ETA: 1:08 - loss: 0.9093 - regression_loss: 0.8163 - classification_loss: 0.0930 227/500 [============>.................] - ETA: 1:08 - loss: 0.9103 - regression_loss: 0.8171 - classification_loss: 0.0932 228/500 [============>.................] - ETA: 1:08 - loss: 0.9122 - regression_loss: 0.8187 - classification_loss: 0.0935 229/500 [============>.................] - ETA: 1:07 - loss: 0.9099 - regression_loss: 0.8168 - classification_loss: 0.0932 230/500 [============>.................] - ETA: 1:07 - loss: 0.9096 - regression_loss: 0.8165 - classification_loss: 0.0931 231/500 [============>.................] - ETA: 1:07 - loss: 0.9080 - regression_loss: 0.8151 - classification_loss: 0.0929 232/500 [============>.................] - ETA: 1:07 - loss: 0.9063 - regression_loss: 0.8136 - classification_loss: 0.0927 233/500 [============>.................] 
- ETA: 1:06 - loss: 0.9066 - regression_loss: 0.8138 - classification_loss: 0.0928 234/500 [=============>................] - ETA: 1:06 - loss: 0.9070 - regression_loss: 0.8143 - classification_loss: 0.0927 235/500 [=============>................] - ETA: 1:06 - loss: 0.9039 - regression_loss: 0.8115 - classification_loss: 0.0924 236/500 [=============>................] - ETA: 1:06 - loss: 0.9061 - regression_loss: 0.8134 - classification_loss: 0.0927 237/500 [=============>................] - ETA: 1:05 - loss: 0.9074 - regression_loss: 0.8144 - classification_loss: 0.0930 238/500 [=============>................] - ETA: 1:05 - loss: 0.9079 - regression_loss: 0.8147 - classification_loss: 0.0933 239/500 [=============>................] - ETA: 1:05 - loss: 0.9082 - regression_loss: 0.8150 - classification_loss: 0.0932 240/500 [=============>................] - ETA: 1:04 - loss: 0.9090 - regression_loss: 0.8156 - classification_loss: 0.0934 241/500 [=============>................] - ETA: 1:04 - loss: 0.9098 - regression_loss: 0.8161 - classification_loss: 0.0937 242/500 [=============>................] - ETA: 1:04 - loss: 0.9099 - regression_loss: 0.8162 - classification_loss: 0.0938 243/500 [=============>................] - ETA: 1:04 - loss: 0.9082 - regression_loss: 0.8143 - classification_loss: 0.0939 244/500 [=============>................] - ETA: 1:04 - loss: 0.9105 - regression_loss: 0.8163 - classification_loss: 0.0942 245/500 [=============>................] - ETA: 1:03 - loss: 0.9126 - regression_loss: 0.8179 - classification_loss: 0.0947 246/500 [=============>................] - ETA: 1:03 - loss: 0.9109 - regression_loss: 0.8165 - classification_loss: 0.0945 247/500 [=============>................] - ETA: 1:03 - loss: 0.9115 - regression_loss: 0.8169 - classification_loss: 0.0947 248/500 [=============>................] - ETA: 1:03 - loss: 0.9099 - regression_loss: 0.8154 - classification_loss: 0.0945 249/500 [=============>................] 
- ETA: 1:02 - loss: 0.9111 - regression_loss: 0.8165 - classification_loss: 0.0946 250/500 [==============>...............] - ETA: 1:02 - loss: 0.9137 - regression_loss: 0.8185 - classification_loss: 0.0951 251/500 [==============>...............] - ETA: 1:02 - loss: 0.9144 - regression_loss: 0.8193 - classification_loss: 0.0951 252/500 [==============>...............] - ETA: 1:02 - loss: 0.9150 - regression_loss: 0.8197 - classification_loss: 0.0953 253/500 [==============>...............] - ETA: 1:01 - loss: 0.9159 - regression_loss: 0.8205 - classification_loss: 0.0954 254/500 [==============>...............] - ETA: 1:01 - loss: 0.9164 - regression_loss: 0.8209 - classification_loss: 0.0954 255/500 [==============>...............] - ETA: 1:01 - loss: 0.9164 - regression_loss: 0.8211 - classification_loss: 0.0953 256/500 [==============>...............] - ETA: 1:01 - loss: 0.9183 - regression_loss: 0.8227 - classification_loss: 0.0956 257/500 [==============>...............] - ETA: 1:00 - loss: 0.9200 - regression_loss: 0.8240 - classification_loss: 0.0960 258/500 [==============>...............] - ETA: 1:00 - loss: 0.9202 - regression_loss: 0.8240 - classification_loss: 0.0962 259/500 [==============>...............] - ETA: 1:00 - loss: 0.9206 - regression_loss: 0.8241 - classification_loss: 0.0965 260/500 [==============>...............] - ETA: 1:00 - loss: 0.9198 - regression_loss: 0.8234 - classification_loss: 0.0964 261/500 [==============>...............] - ETA: 59s - loss: 0.9184 - regression_loss: 0.8222 - classification_loss: 0.0962  262/500 [==============>...............] - ETA: 59s - loss: 0.9174 - regression_loss: 0.8213 - classification_loss: 0.0961 263/500 [==============>...............] - ETA: 59s - loss: 0.9157 - regression_loss: 0.8199 - classification_loss: 0.0957 264/500 [==============>...............] - ETA: 58s - loss: 0.9149 - regression_loss: 0.8194 - classification_loss: 0.0956 265/500 [==============>...............] 
- ETA: 58s - loss: 0.9137 - regression_loss: 0.8183 - classification_loss: 0.0954 266/500 [==============>...............] - ETA: 58s - loss: 0.9157 - regression_loss: 0.8203 - classification_loss: 0.0954 267/500 [===============>..............] - ETA: 58s - loss: 0.9148 - regression_loss: 0.8196 - classification_loss: 0.0952 268/500 [===============>..............] - ETA: 57s - loss: 0.9131 - regression_loss: 0.8182 - classification_loss: 0.0949 269/500 [===============>..............] - ETA: 57s - loss: 0.9117 - regression_loss: 0.8171 - classification_loss: 0.0947 270/500 [===============>..............] - ETA: 57s - loss: 0.9111 - regression_loss: 0.8163 - classification_loss: 0.0947 271/500 [===============>..............] - ETA: 57s - loss: 0.9112 - regression_loss: 0.8167 - classification_loss: 0.0945 272/500 [===============>..............] - ETA: 56s - loss: 0.9120 - regression_loss: 0.8174 - classification_loss: 0.0946 273/500 [===============>..............] - ETA: 56s - loss: 0.9141 - regression_loss: 0.8190 - classification_loss: 0.0951 274/500 [===============>..............] - ETA: 56s - loss: 0.9136 - regression_loss: 0.8187 - classification_loss: 0.0949 275/500 [===============>..............] - ETA: 56s - loss: 0.9154 - regression_loss: 0.8201 - classification_loss: 0.0952 276/500 [===============>..............] - ETA: 55s - loss: 0.9150 - regression_loss: 0.8198 - classification_loss: 0.0952 277/500 [===============>..............] - ETA: 55s - loss: 0.9149 - regression_loss: 0.8199 - classification_loss: 0.0950 278/500 [===============>..............] - ETA: 55s - loss: 0.9151 - regression_loss: 0.8198 - classification_loss: 0.0953 279/500 [===============>..............] - ETA: 55s - loss: 0.9146 - regression_loss: 0.8193 - classification_loss: 0.0953 280/500 [===============>..............] - ETA: 54s - loss: 0.9137 - regression_loss: 0.8184 - classification_loss: 0.0952 281/500 [===============>..............] 
- ETA: 54s - loss: 0.9130 - regression_loss: 0.8178 - classification_loss: 0.0952 282/500 [===============>..............] - ETA: 54s - loss: 0.9142 - regression_loss: 0.8188 - classification_loss: 0.0954 283/500 [===============>..............] - ETA: 54s - loss: 0.9125 - regression_loss: 0.8174 - classification_loss: 0.0951 284/500 [================>.............] - ETA: 53s - loss: 0.9111 - regression_loss: 0.8163 - classification_loss: 0.0948 285/500 [================>.............] - ETA: 53s - loss: 0.9128 - regression_loss: 0.8177 - classification_loss: 0.0951 286/500 [================>.............] - ETA: 53s - loss: 0.9133 - regression_loss: 0.8181 - classification_loss: 0.0952 287/500 [================>.............] - ETA: 53s - loss: 0.9133 - regression_loss: 0.8180 - classification_loss: 0.0953 288/500 [================>.............] - ETA: 52s - loss: 0.9133 - regression_loss: 0.8180 - classification_loss: 0.0953 289/500 [================>.............] - ETA: 52s - loss: 0.9118 - regression_loss: 0.8167 - classification_loss: 0.0951 290/500 [================>.............] - ETA: 52s - loss: 0.9125 - regression_loss: 0.8174 - classification_loss: 0.0951 291/500 [================>.............] - ETA: 52s - loss: 0.9112 - regression_loss: 0.8163 - classification_loss: 0.0949 292/500 [================>.............] - ETA: 51s - loss: 0.9117 - regression_loss: 0.8168 - classification_loss: 0.0949 293/500 [================>.............] - ETA: 51s - loss: 0.9112 - regression_loss: 0.8162 - classification_loss: 0.0950 294/500 [================>.............] - ETA: 51s - loss: 0.9115 - regression_loss: 0.8164 - classification_loss: 0.0951 295/500 [================>.............] - ETA: 51s - loss: 0.9125 - regression_loss: 0.8172 - classification_loss: 0.0952 296/500 [================>.............] - ETA: 50s - loss: 0.9118 - regression_loss: 0.8168 - classification_loss: 0.0950 297/500 [================>.............] 
- ETA: 50s - loss: 0.9121 - regression_loss: 0.8171 - classification_loss: 0.0950 298/500 [================>.............] - ETA: 50s - loss: 0.9112 - regression_loss: 0.8163 - classification_loss: 0.0949 299/500 [================>.............] - ETA: 50s - loss: 0.9124 - regression_loss: 0.8174 - classification_loss: 0.0950 300/500 [=================>............] - ETA: 49s - loss: 0.9111 - regression_loss: 0.8163 - classification_loss: 0.0948 301/500 [=================>............] - ETA: 49s - loss: 0.9104 - regression_loss: 0.8158 - classification_loss: 0.0946 302/500 [=================>............] - ETA: 49s - loss: 0.9090 - regression_loss: 0.8145 - classification_loss: 0.0944 303/500 [=================>............] - ETA: 49s - loss: 0.9085 - regression_loss: 0.8142 - classification_loss: 0.0943 304/500 [=================>............] - ETA: 48s - loss: 0.9097 - regression_loss: 0.8151 - classification_loss: 0.0945 305/500 [=================>............] - ETA: 48s - loss: 0.9106 - regression_loss: 0.8158 - classification_loss: 0.0947 306/500 [=================>............] - ETA: 48s - loss: 0.9122 - regression_loss: 0.8173 - classification_loss: 0.0949 307/500 [=================>............] - ETA: 48s - loss: 0.9117 - regression_loss: 0.8169 - classification_loss: 0.0948 308/500 [=================>............] - ETA: 47s - loss: 0.9116 - regression_loss: 0.8170 - classification_loss: 0.0946 309/500 [=================>............] - ETA: 47s - loss: 0.9114 - regression_loss: 0.8169 - classification_loss: 0.0945 310/500 [=================>............] - ETA: 47s - loss: 0.9115 - regression_loss: 0.8170 - classification_loss: 0.0945 311/500 [=================>............] - ETA: 47s - loss: 0.9117 - regression_loss: 0.8172 - classification_loss: 0.0945 312/500 [=================>............] - ETA: 46s - loss: 0.9116 - regression_loss: 0.8172 - classification_loss: 0.0945 313/500 [=================>............] 
[... per-step progress-bar output for epoch 48 (steps ~314–489 of 500) omitted; running loss fluctuated between roughly 0.90 and 0.92, with regression_loss ~0.81 and classification_loss ~0.095 ...]
[... remaining per-step progress-bar output for epoch 48 omitted ...]
500/500 [==============================] - 125s 249ms/step - loss: 0.9083 - regression_loss: 0.8138 - classification_loss: 0.0945
1172 instances of class plum with average precision: 0.8078
mAP: 0.8078
Epoch 00048: saving model to ./training/snapshots/resnet50_pascal_48.h5
Epoch 49/150
[... per-step progress-bar output for epoch 49 (steps 1–148 of 500) omitted; running loss climbed from ~0.49 at step 2 and settled around 0.86–0.88 by step 148 ...]
- ETA: 1:26 - loss: 0.8704 - regression_loss: 0.7816 - classification_loss: 0.0888 149/500 [=======>......................] - ETA: 1:26 - loss: 0.8679 - regression_loss: 0.7794 - classification_loss: 0.0885 150/500 [========>.....................] - ETA: 1:25 - loss: 0.8673 - regression_loss: 0.7784 - classification_loss: 0.0889 151/500 [========>.....................] - ETA: 1:25 - loss: 0.8660 - regression_loss: 0.7773 - classification_loss: 0.0886 152/500 [========>.....................] - ETA: 1:25 - loss: 0.8641 - regression_loss: 0.7757 - classification_loss: 0.0885 153/500 [========>.....................] - ETA: 1:25 - loss: 0.8655 - regression_loss: 0.7768 - classification_loss: 0.0887 154/500 [========>.....................] - ETA: 1:25 - loss: 0.8623 - regression_loss: 0.7741 - classification_loss: 0.0883 155/500 [========>.....................] - ETA: 1:24 - loss: 0.8651 - regression_loss: 0.7766 - classification_loss: 0.0885 156/500 [========>.....................] - ETA: 1:24 - loss: 0.8654 - regression_loss: 0.7767 - classification_loss: 0.0887 157/500 [========>.....................] - ETA: 1:24 - loss: 0.8682 - regression_loss: 0.7791 - classification_loss: 0.0891 158/500 [========>.....................] - ETA: 1:24 - loss: 0.8690 - regression_loss: 0.7797 - classification_loss: 0.0892 159/500 [========>.....................] - ETA: 1:23 - loss: 0.8685 - regression_loss: 0.7792 - classification_loss: 0.0893 160/500 [========>.....................] - ETA: 1:23 - loss: 0.8684 - regression_loss: 0.7789 - classification_loss: 0.0895 161/500 [========>.....................] - ETA: 1:23 - loss: 0.8690 - regression_loss: 0.7796 - classification_loss: 0.0893 162/500 [========>.....................] - ETA: 1:23 - loss: 0.8666 - regression_loss: 0.7776 - classification_loss: 0.0889 163/500 [========>.....................] - ETA: 1:22 - loss: 0.8640 - regression_loss: 0.7755 - classification_loss: 0.0886 164/500 [========>.....................] 
- ETA: 1:22 - loss: 0.8637 - regression_loss: 0.7751 - classification_loss: 0.0886 165/500 [========>.....................] - ETA: 1:22 - loss: 0.8667 - regression_loss: 0.7778 - classification_loss: 0.0889 166/500 [========>.....................] - ETA: 1:22 - loss: 0.8667 - regression_loss: 0.7778 - classification_loss: 0.0889 167/500 [=========>....................] - ETA: 1:21 - loss: 0.8662 - regression_loss: 0.7774 - classification_loss: 0.0888 168/500 [=========>....................] - ETA: 1:21 - loss: 0.8650 - regression_loss: 0.7763 - classification_loss: 0.0886 169/500 [=========>....................] - ETA: 1:21 - loss: 0.8680 - regression_loss: 0.7788 - classification_loss: 0.0892 170/500 [=========>....................] - ETA: 1:21 - loss: 0.8699 - regression_loss: 0.7802 - classification_loss: 0.0897 171/500 [=========>....................] - ETA: 1:20 - loss: 0.8666 - regression_loss: 0.7774 - classification_loss: 0.0892 172/500 [=========>....................] - ETA: 1:20 - loss: 0.8690 - regression_loss: 0.7793 - classification_loss: 0.0897 173/500 [=========>....................] - ETA: 1:20 - loss: 0.8711 - regression_loss: 0.7813 - classification_loss: 0.0898 174/500 [=========>....................] - ETA: 1:20 - loss: 0.8714 - regression_loss: 0.7817 - classification_loss: 0.0898 175/500 [=========>....................] - ETA: 1:20 - loss: 0.8722 - regression_loss: 0.7821 - classification_loss: 0.0900 176/500 [=========>....................] - ETA: 1:19 - loss: 0.8723 - regression_loss: 0.7823 - classification_loss: 0.0900 177/500 [=========>....................] - ETA: 1:19 - loss: 0.8744 - regression_loss: 0.7842 - classification_loss: 0.0902 178/500 [=========>....................] - ETA: 1:19 - loss: 0.8757 - regression_loss: 0.7855 - classification_loss: 0.0902 179/500 [=========>....................] - ETA: 1:19 - loss: 0.8749 - regression_loss: 0.7847 - classification_loss: 0.0902 180/500 [=========>....................] 
- ETA: 1:18 - loss: 0.8754 - regression_loss: 0.7852 - classification_loss: 0.0902 181/500 [=========>....................] - ETA: 1:18 - loss: 0.8769 - regression_loss: 0.7864 - classification_loss: 0.0905 182/500 [=========>....................] - ETA: 1:18 - loss: 0.8748 - regression_loss: 0.7847 - classification_loss: 0.0901 183/500 [=========>....................] - ETA: 1:18 - loss: 0.8770 - regression_loss: 0.7863 - classification_loss: 0.0906 184/500 [==========>...................] - ETA: 1:18 - loss: 0.8773 - regression_loss: 0.7867 - classification_loss: 0.0906 185/500 [==========>...................] - ETA: 1:17 - loss: 0.8780 - regression_loss: 0.7869 - classification_loss: 0.0911 186/500 [==========>...................] - ETA: 1:17 - loss: 0.8764 - regression_loss: 0.7856 - classification_loss: 0.0908 187/500 [==========>...................] - ETA: 1:17 - loss: 0.8765 - regression_loss: 0.7858 - classification_loss: 0.0907 188/500 [==========>...................] - ETA: 1:17 - loss: 0.8781 - regression_loss: 0.7873 - classification_loss: 0.0908 189/500 [==========>...................] - ETA: 1:16 - loss: 0.8793 - regression_loss: 0.7882 - classification_loss: 0.0911 190/500 [==========>...................] - ETA: 1:16 - loss: 0.8773 - regression_loss: 0.7866 - classification_loss: 0.0908 191/500 [==========>...................] - ETA: 1:16 - loss: 0.8756 - regression_loss: 0.7852 - classification_loss: 0.0904 192/500 [==========>...................] - ETA: 1:16 - loss: 0.8786 - regression_loss: 0.7881 - classification_loss: 0.0905 193/500 [==========>...................] - ETA: 1:15 - loss: 0.8792 - regression_loss: 0.7887 - classification_loss: 0.0905 194/500 [==========>...................] - ETA: 1:15 - loss: 0.8769 - regression_loss: 0.7867 - classification_loss: 0.0902 195/500 [==========>...................] - ETA: 1:15 - loss: 0.8754 - regression_loss: 0.7852 - classification_loss: 0.0902 196/500 [==========>...................] 
- ETA: 1:15 - loss: 0.8769 - regression_loss: 0.7865 - classification_loss: 0.0904 197/500 [==========>...................] - ETA: 1:14 - loss: 0.8738 - regression_loss: 0.7833 - classification_loss: 0.0904 198/500 [==========>...................] - ETA: 1:14 - loss: 0.8746 - regression_loss: 0.7839 - classification_loss: 0.0907 199/500 [==========>...................] - ETA: 1:14 - loss: 0.8745 - regression_loss: 0.7837 - classification_loss: 0.0908 200/500 [===========>..................] - ETA: 1:14 - loss: 0.8760 - regression_loss: 0.7851 - classification_loss: 0.0909 201/500 [===========>..................] - ETA: 1:14 - loss: 0.8763 - regression_loss: 0.7855 - classification_loss: 0.0908 202/500 [===========>..................] - ETA: 1:13 - loss: 0.8792 - regression_loss: 0.7880 - classification_loss: 0.0912 203/500 [===========>..................] - ETA: 1:13 - loss: 0.8804 - regression_loss: 0.7889 - classification_loss: 0.0914 204/500 [===========>..................] - ETA: 1:13 - loss: 0.8775 - regression_loss: 0.7863 - classification_loss: 0.0911 205/500 [===========>..................] - ETA: 1:13 - loss: 0.8768 - regression_loss: 0.7859 - classification_loss: 0.0909 206/500 [===========>..................] - ETA: 1:12 - loss: 0.8743 - regression_loss: 0.7836 - classification_loss: 0.0907 207/500 [===========>..................] - ETA: 1:12 - loss: 0.8738 - regression_loss: 0.7834 - classification_loss: 0.0904 208/500 [===========>..................] - ETA: 1:12 - loss: 0.8754 - regression_loss: 0.7849 - classification_loss: 0.0905 209/500 [===========>..................] - ETA: 1:12 - loss: 0.8744 - regression_loss: 0.7838 - classification_loss: 0.0906 210/500 [===========>..................] - ETA: 1:11 - loss: 0.8717 - regression_loss: 0.7815 - classification_loss: 0.0903 211/500 [===========>..................] - ETA: 1:11 - loss: 0.8745 - regression_loss: 0.7839 - classification_loss: 0.0907 212/500 [===========>..................] 
- ETA: 1:11 - loss: 0.8741 - regression_loss: 0.7835 - classification_loss: 0.0907 213/500 [===========>..................] - ETA: 1:11 - loss: 0.8746 - regression_loss: 0.7839 - classification_loss: 0.0907 214/500 [===========>..................] - ETA: 1:10 - loss: 0.8726 - regression_loss: 0.7822 - classification_loss: 0.0903 215/500 [===========>..................] - ETA: 1:10 - loss: 0.8733 - regression_loss: 0.7827 - classification_loss: 0.0906 216/500 [===========>..................] - ETA: 1:10 - loss: 0.8740 - regression_loss: 0.7834 - classification_loss: 0.0906 217/500 [============>.................] - ETA: 1:10 - loss: 0.8772 - regression_loss: 0.7860 - classification_loss: 0.0912 218/500 [============>.................] - ETA: 1:09 - loss: 0.8754 - regression_loss: 0.7844 - classification_loss: 0.0910 219/500 [============>.................] - ETA: 1:09 - loss: 0.8755 - regression_loss: 0.7846 - classification_loss: 0.0909 220/500 [============>.................] - ETA: 1:09 - loss: 0.8779 - regression_loss: 0.7866 - classification_loss: 0.0913 221/500 [============>.................] - ETA: 1:09 - loss: 0.8758 - regression_loss: 0.7849 - classification_loss: 0.0909 222/500 [============>.................] - ETA: 1:08 - loss: 0.8768 - regression_loss: 0.7855 - classification_loss: 0.0913 223/500 [============>.................] - ETA: 1:08 - loss: 0.8746 - regression_loss: 0.7835 - classification_loss: 0.0910 224/500 [============>.................] - ETA: 1:08 - loss: 0.8762 - regression_loss: 0.7852 - classification_loss: 0.0910 225/500 [============>.................] - ETA: 1:08 - loss: 0.8775 - regression_loss: 0.7864 - classification_loss: 0.0911 226/500 [============>.................] - ETA: 1:07 - loss: 0.8770 - regression_loss: 0.7861 - classification_loss: 0.0909 227/500 [============>.................] - ETA: 1:07 - loss: 0.8770 - regression_loss: 0.7861 - classification_loss: 0.0910 228/500 [============>.................] 
- ETA: 1:07 - loss: 0.8782 - regression_loss: 0.7872 - classification_loss: 0.0910 229/500 [============>.................] - ETA: 1:07 - loss: 0.8790 - regression_loss: 0.7880 - classification_loss: 0.0910 230/500 [============>.................] - ETA: 1:06 - loss: 0.8779 - regression_loss: 0.7870 - classification_loss: 0.0909 231/500 [============>.................] - ETA: 1:06 - loss: 0.8791 - regression_loss: 0.7880 - classification_loss: 0.0911 232/500 [============>.................] - ETA: 1:06 - loss: 0.8790 - regression_loss: 0.7879 - classification_loss: 0.0911 233/500 [============>.................] - ETA: 1:06 - loss: 0.8807 - regression_loss: 0.7893 - classification_loss: 0.0914 234/500 [=============>................] - ETA: 1:05 - loss: 0.8816 - regression_loss: 0.7900 - classification_loss: 0.0916 235/500 [=============>................] - ETA: 1:05 - loss: 0.8840 - regression_loss: 0.7921 - classification_loss: 0.0920 236/500 [=============>................] - ETA: 1:05 - loss: 0.8826 - regression_loss: 0.7908 - classification_loss: 0.0918 237/500 [=============>................] - ETA: 1:05 - loss: 0.8822 - regression_loss: 0.7905 - classification_loss: 0.0917 238/500 [=============>................] - ETA: 1:05 - loss: 0.8821 - regression_loss: 0.7905 - classification_loss: 0.0916 239/500 [=============>................] - ETA: 1:04 - loss: 0.8849 - regression_loss: 0.7931 - classification_loss: 0.0918 240/500 [=============>................] - ETA: 1:04 - loss: 0.8860 - regression_loss: 0.7941 - classification_loss: 0.0919 241/500 [=============>................] - ETA: 1:04 - loss: 0.8844 - regression_loss: 0.7926 - classification_loss: 0.0917 242/500 [=============>................] - ETA: 1:04 - loss: 0.8856 - regression_loss: 0.7936 - classification_loss: 0.0920 243/500 [=============>................] - ETA: 1:03 - loss: 0.8853 - regression_loss: 0.7934 - classification_loss: 0.0919 244/500 [=============>................] 
- ETA: 1:03 - loss: 0.8832 - regression_loss: 0.7914 - classification_loss: 0.0917 245/500 [=============>................] - ETA: 1:03 - loss: 0.8818 - regression_loss: 0.7903 - classification_loss: 0.0915 246/500 [=============>................] - ETA: 1:03 - loss: 0.8805 - regression_loss: 0.7891 - classification_loss: 0.0913 247/500 [=============>................] - ETA: 1:02 - loss: 0.8792 - regression_loss: 0.7881 - classification_loss: 0.0911 248/500 [=============>................] - ETA: 1:02 - loss: 0.8803 - regression_loss: 0.7890 - classification_loss: 0.0913 249/500 [=============>................] - ETA: 1:02 - loss: 0.8797 - regression_loss: 0.7884 - classification_loss: 0.0913 250/500 [==============>...............] - ETA: 1:02 - loss: 0.8793 - regression_loss: 0.7882 - classification_loss: 0.0911 251/500 [==============>...............] - ETA: 1:01 - loss: 0.8807 - regression_loss: 0.7895 - classification_loss: 0.0912 252/500 [==============>...............] - ETA: 1:01 - loss: 0.8813 - regression_loss: 0.7900 - classification_loss: 0.0913 253/500 [==============>...............] - ETA: 1:01 - loss: 0.8811 - regression_loss: 0.7899 - classification_loss: 0.0912 254/500 [==============>...............] - ETA: 1:01 - loss: 0.8816 - regression_loss: 0.7902 - classification_loss: 0.0914 255/500 [==============>...............] - ETA: 1:00 - loss: 0.8825 - regression_loss: 0.7911 - classification_loss: 0.0915 256/500 [==============>...............] - ETA: 1:00 - loss: 0.8836 - regression_loss: 0.7920 - classification_loss: 0.0916 257/500 [==============>...............] - ETA: 1:00 - loss: 0.8867 - regression_loss: 0.7945 - classification_loss: 0.0922 258/500 [==============>...............] - ETA: 1:00 - loss: 0.8875 - regression_loss: 0.7952 - classification_loss: 0.0923 259/500 [==============>...............] - ETA: 59s - loss: 0.8868 - regression_loss: 0.7948 - classification_loss: 0.0920  260/500 [==============>...............] 
- ETA: 59s - loss: 0.8869 - regression_loss: 0.7949 - classification_loss: 0.0920 261/500 [==============>...............] - ETA: 59s - loss: 0.8872 - regression_loss: 0.7950 - classification_loss: 0.0922 262/500 [==============>...............] - ETA: 59s - loss: 0.8874 - regression_loss: 0.7953 - classification_loss: 0.0921 263/500 [==============>...............] - ETA: 58s - loss: 0.8868 - regression_loss: 0.7948 - classification_loss: 0.0920 264/500 [==============>...............] - ETA: 58s - loss: 0.8872 - regression_loss: 0.7951 - classification_loss: 0.0922 265/500 [==============>...............] - ETA: 58s - loss: 0.8883 - regression_loss: 0.7960 - classification_loss: 0.0923 266/500 [==============>...............] - ETA: 58s - loss: 0.8882 - regression_loss: 0.7960 - classification_loss: 0.0922 267/500 [===============>..............] - ETA: 57s - loss: 0.8879 - regression_loss: 0.7957 - classification_loss: 0.0922 268/500 [===============>..............] - ETA: 57s - loss: 0.8864 - regression_loss: 0.7945 - classification_loss: 0.0919 269/500 [===============>..............] - ETA: 57s - loss: 0.8840 - regression_loss: 0.7924 - classification_loss: 0.0916 270/500 [===============>..............] - ETA: 57s - loss: 0.8840 - regression_loss: 0.7926 - classification_loss: 0.0913 271/500 [===============>..............] - ETA: 56s - loss: 0.8853 - regression_loss: 0.7937 - classification_loss: 0.0916 272/500 [===============>..............] - ETA: 56s - loss: 0.8843 - regression_loss: 0.7928 - classification_loss: 0.0914 273/500 [===============>..............] - ETA: 56s - loss: 0.8843 - regression_loss: 0.7929 - classification_loss: 0.0914 274/500 [===============>..............] - ETA: 56s - loss: 0.8846 - regression_loss: 0.7932 - classification_loss: 0.0914 275/500 [===============>..............] - ETA: 55s - loss: 0.8859 - regression_loss: 0.7943 - classification_loss: 0.0917 276/500 [===============>..............] 
- ETA: 55s - loss: 0.8865 - regression_loss: 0.7948 - classification_loss: 0.0917 277/500 [===============>..............] - ETA: 55s - loss: 0.8873 - regression_loss: 0.7954 - classification_loss: 0.0919 278/500 [===============>..............] - ETA: 55s - loss: 0.8867 - regression_loss: 0.7949 - classification_loss: 0.0918 279/500 [===============>..............] - ETA: 54s - loss: 0.8877 - regression_loss: 0.7958 - classification_loss: 0.0919 280/500 [===============>..............] - ETA: 54s - loss: 0.8869 - regression_loss: 0.7953 - classification_loss: 0.0917 281/500 [===============>..............] - ETA: 54s - loss: 0.8856 - regression_loss: 0.7941 - classification_loss: 0.0914 282/500 [===============>..............] - ETA: 54s - loss: 0.8859 - regression_loss: 0.7943 - classification_loss: 0.0916 283/500 [===============>..............] - ETA: 53s - loss: 0.8860 - regression_loss: 0.7940 - classification_loss: 0.0919 284/500 [================>.............] - ETA: 53s - loss: 0.8861 - regression_loss: 0.7942 - classification_loss: 0.0919 285/500 [================>.............] - ETA: 53s - loss: 0.8849 - regression_loss: 0.7932 - classification_loss: 0.0918 286/500 [================>.............] - ETA: 52s - loss: 0.8843 - regression_loss: 0.7927 - classification_loss: 0.0916 287/500 [================>.............] - ETA: 52s - loss: 0.8833 - regression_loss: 0.7918 - classification_loss: 0.0914 288/500 [================>.............] - ETA: 52s - loss: 0.8828 - regression_loss: 0.7914 - classification_loss: 0.0914 289/500 [================>.............] - ETA: 52s - loss: 0.8831 - regression_loss: 0.7917 - classification_loss: 0.0914 290/500 [================>.............] - ETA: 51s - loss: 0.8817 - regression_loss: 0.7904 - classification_loss: 0.0912 291/500 [================>.............] - ETA: 51s - loss: 0.8837 - regression_loss: 0.7921 - classification_loss: 0.0916 292/500 [================>.............] 
- ETA: 51s - loss: 0.8833 - regression_loss: 0.7917 - classification_loss: 0.0916 293/500 [================>.............] - ETA: 51s - loss: 0.8827 - regression_loss: 0.7913 - classification_loss: 0.0914 294/500 [================>.............] - ETA: 50s - loss: 0.8835 - regression_loss: 0.7920 - classification_loss: 0.0915 295/500 [================>.............] - ETA: 50s - loss: 0.8850 - regression_loss: 0.7932 - classification_loss: 0.0918 296/500 [================>.............] - ETA: 50s - loss: 0.8851 - regression_loss: 0.7933 - classification_loss: 0.0918 297/500 [================>.............] - ETA: 50s - loss: 0.8867 - regression_loss: 0.7947 - classification_loss: 0.0921 298/500 [================>.............] - ETA: 49s - loss: 0.8849 - regression_loss: 0.7931 - classification_loss: 0.0918 299/500 [================>.............] - ETA: 49s - loss: 0.8856 - regression_loss: 0.7937 - classification_loss: 0.0919 300/500 [=================>............] - ETA: 49s - loss: 0.8849 - regression_loss: 0.7931 - classification_loss: 0.0918 301/500 [=================>............] - ETA: 49s - loss: 0.8851 - regression_loss: 0.7934 - classification_loss: 0.0917 302/500 [=================>............] - ETA: 48s - loss: 0.8855 - regression_loss: 0.7937 - classification_loss: 0.0918 303/500 [=================>............] - ETA: 48s - loss: 0.8870 - regression_loss: 0.7951 - classification_loss: 0.0920 304/500 [=================>............] - ETA: 48s - loss: 0.8857 - regression_loss: 0.7940 - classification_loss: 0.0917 305/500 [=================>............] - ETA: 48s - loss: 0.8860 - regression_loss: 0.7942 - classification_loss: 0.0918 306/500 [=================>............] - ETA: 47s - loss: 0.8852 - regression_loss: 0.7935 - classification_loss: 0.0917 307/500 [=================>............] - ETA: 47s - loss: 0.8847 - regression_loss: 0.7931 - classification_loss: 0.0916 308/500 [=================>............] 
- ETA: 47s - loss: 0.8852 - regression_loss: 0.7934 - classification_loss: 0.0917 309/500 [=================>............] - ETA: 47s - loss: 0.8872 - regression_loss: 0.7951 - classification_loss: 0.0920 310/500 [=================>............] - ETA: 46s - loss: 0.8876 - regression_loss: 0.7955 - classification_loss: 0.0921 311/500 [=================>............] - ETA: 46s - loss: 0.8867 - regression_loss: 0.7947 - classification_loss: 0.0920 312/500 [=================>............] - ETA: 46s - loss: 0.8852 - regression_loss: 0.7934 - classification_loss: 0.0918 313/500 [=================>............] - ETA: 46s - loss: 0.8860 - regression_loss: 0.7941 - classification_loss: 0.0919 314/500 [=================>............] - ETA: 45s - loss: 0.8846 - regression_loss: 0.7929 - classification_loss: 0.0917 315/500 [=================>............] - ETA: 45s - loss: 0.8845 - regression_loss: 0.7930 - classification_loss: 0.0916 316/500 [=================>............] - ETA: 45s - loss: 0.8838 - regression_loss: 0.7924 - classification_loss: 0.0914 317/500 [==================>...........] - ETA: 45s - loss: 0.8829 - regression_loss: 0.7917 - classification_loss: 0.0912 318/500 [==================>...........] - ETA: 44s - loss: 0.8815 - regression_loss: 0.7905 - classification_loss: 0.0910 319/500 [==================>...........] - ETA: 44s - loss: 0.8820 - regression_loss: 0.7910 - classification_loss: 0.0910 320/500 [==================>...........] - ETA: 44s - loss: 0.8819 - regression_loss: 0.7911 - classification_loss: 0.0909 321/500 [==================>...........] - ETA: 44s - loss: 0.8825 - regression_loss: 0.7916 - classification_loss: 0.0909 322/500 [==================>...........] - ETA: 43s - loss: 0.8811 - regression_loss: 0.7904 - classification_loss: 0.0907 323/500 [==================>...........] - ETA: 43s - loss: 0.8797 - regression_loss: 0.7893 - classification_loss: 0.0904 324/500 [==================>...........] 
- ETA: 43s - loss: 0.8794 - regression_loss: 0.7889 - classification_loss: 0.0905 325/500 [==================>...........] - ETA: 43s - loss: 0.8800 - regression_loss: 0.7895 - classification_loss: 0.0905 326/500 [==================>...........] - ETA: 42s - loss: 0.8791 - regression_loss: 0.7887 - classification_loss: 0.0904 327/500 [==================>...........] - ETA: 42s - loss: 0.8793 - regression_loss: 0.7888 - classification_loss: 0.0905 328/500 [==================>...........] - ETA: 42s - loss: 0.8796 - regression_loss: 0.7893 - classification_loss: 0.0903 329/500 [==================>...........] - ETA: 42s - loss: 0.8797 - regression_loss: 0.7896 - classification_loss: 0.0902 330/500 [==================>...........] - ETA: 41s - loss: 0.8805 - regression_loss: 0.7902 - classification_loss: 0.0903 331/500 [==================>...........] - ETA: 41s - loss: 0.8803 - regression_loss: 0.7901 - classification_loss: 0.0902 332/500 [==================>...........] - ETA: 41s - loss: 0.8793 - regression_loss: 0.7891 - classification_loss: 0.0901 333/500 [==================>...........] - ETA: 41s - loss: 0.8798 - regression_loss: 0.7896 - classification_loss: 0.0902 334/500 [===================>..........] - ETA: 40s - loss: 0.8783 - regression_loss: 0.7883 - classification_loss: 0.0900 335/500 [===================>..........] - ETA: 40s - loss: 0.8789 - regression_loss: 0.7888 - classification_loss: 0.0901 336/500 [===================>..........] - ETA: 40s - loss: 0.8795 - regression_loss: 0.7892 - classification_loss: 0.0902 337/500 [===================>..........] - ETA: 40s - loss: 0.8782 - regression_loss: 0.7881 - classification_loss: 0.0901 338/500 [===================>..........] - ETA: 39s - loss: 0.8784 - regression_loss: 0.7883 - classification_loss: 0.0901 339/500 [===================>..........] - ETA: 39s - loss: 0.8788 - regression_loss: 0.7886 - classification_loss: 0.0902 340/500 [===================>..........] 
- ETA: 39s - loss: 0.8791 - regression_loss: 0.7889 - classification_loss: 0.0903 341/500 [===================>..........] - ETA: 39s - loss: 0.8801 - regression_loss: 0.7897 - classification_loss: 0.0904 342/500 [===================>..........] - ETA: 39s - loss: 0.8808 - regression_loss: 0.7904 - classification_loss: 0.0904 343/500 [===================>..........] - ETA: 38s - loss: 0.8810 - regression_loss: 0.7906 - classification_loss: 0.0905 344/500 [===================>..........] - ETA: 38s - loss: 0.8820 - regression_loss: 0.7913 - classification_loss: 0.0907 345/500 [===================>..........] - ETA: 38s - loss: 0.8815 - regression_loss: 0.7908 - classification_loss: 0.0907 346/500 [===================>..........] - ETA: 38s - loss: 0.8827 - regression_loss: 0.7919 - classification_loss: 0.0908 347/500 [===================>..........] - ETA: 37s - loss: 0.8827 - regression_loss: 0.7920 - classification_loss: 0.0907 348/500 [===================>..........] - ETA: 37s - loss: 0.8823 - regression_loss: 0.7917 - classification_loss: 0.0906 349/500 [===================>..........] - ETA: 37s - loss: 0.8832 - regression_loss: 0.7925 - classification_loss: 0.0907 350/500 [====================>.........] - ETA: 37s - loss: 0.8836 - regression_loss: 0.7929 - classification_loss: 0.0908 351/500 [====================>.........] - ETA: 36s - loss: 0.8844 - regression_loss: 0.7936 - classification_loss: 0.0909 352/500 [====================>.........] - ETA: 36s - loss: 0.8844 - regression_loss: 0.7936 - classification_loss: 0.0908 353/500 [====================>.........] - ETA: 36s - loss: 0.8831 - regression_loss: 0.7925 - classification_loss: 0.0906 354/500 [====================>.........] - ETA: 36s - loss: 0.8831 - regression_loss: 0.7925 - classification_loss: 0.0906 355/500 [====================>.........] - ETA: 35s - loss: 0.8841 - regression_loss: 0.7932 - classification_loss: 0.0909 356/500 [====================>.........] 
[per-batch progress output for batches 357-499 of epoch 49 elided; running loss held near 0.88 (regression ~0.79, classification ~0.09) throughout]
500/500 [==============================] - 124s 248ms/step - loss: 0.8886 - regression_loss: 0.7974 - classification_loss: 0.0912
1172 instances of class plum with average precision: 0.8081
mAP: 0.8081
Epoch 00049: saving model to ./training/snapshots/resnet50_pascal_49.h5
Epoch 50/150
[per-batch progress output for batches 1-190 of epoch 50 elided; running loss rose from ~0.64 at batch 1 to ~0.85 (regression ~0.76, classification ~0.08) by batch 190/500; log truncated mid-epoch]
- ETA: 1:18 - loss: 0.8471 - regression_loss: 0.7632 - classification_loss: 0.0839 191/500 [==========>...................] - ETA: 1:18 - loss: 0.8464 - regression_loss: 0.7628 - classification_loss: 0.0836 192/500 [==========>...................] - ETA: 1:17 - loss: 0.8459 - regression_loss: 0.7621 - classification_loss: 0.0838 193/500 [==========>...................] - ETA: 1:17 - loss: 0.8485 - regression_loss: 0.7642 - classification_loss: 0.0843 194/500 [==========>...................] - ETA: 1:17 - loss: 0.8454 - regression_loss: 0.7614 - classification_loss: 0.0840 195/500 [==========>...................] - ETA: 1:17 - loss: 0.8439 - regression_loss: 0.7602 - classification_loss: 0.0837 196/500 [==========>...................] - ETA: 1:16 - loss: 0.8420 - regression_loss: 0.7587 - classification_loss: 0.0834 197/500 [==========>...................] - ETA: 1:16 - loss: 0.8424 - regression_loss: 0.7590 - classification_loss: 0.0835 198/500 [==========>...................] - ETA: 1:16 - loss: 0.8415 - regression_loss: 0.7580 - classification_loss: 0.0835 199/500 [==========>...................] - ETA: 1:16 - loss: 0.8431 - regression_loss: 0.7594 - classification_loss: 0.0838 200/500 [===========>..................] - ETA: 1:15 - loss: 0.8430 - regression_loss: 0.7593 - classification_loss: 0.0837 201/500 [===========>..................] - ETA: 1:15 - loss: 0.8417 - regression_loss: 0.7582 - classification_loss: 0.0835 202/500 [===========>..................] - ETA: 1:15 - loss: 0.8416 - regression_loss: 0.7581 - classification_loss: 0.0835 203/500 [===========>..................] - ETA: 1:15 - loss: 0.8425 - regression_loss: 0.7587 - classification_loss: 0.0838 204/500 [===========>..................] - ETA: 1:14 - loss: 0.8415 - regression_loss: 0.7579 - classification_loss: 0.0836 205/500 [===========>..................] - ETA: 1:14 - loss: 0.8430 - regression_loss: 0.7592 - classification_loss: 0.0839 206/500 [===========>..................] 
- ETA: 1:14 - loss: 0.8415 - regression_loss: 0.7580 - classification_loss: 0.0835 207/500 [===========>..................] - ETA: 1:14 - loss: 0.8420 - regression_loss: 0.7584 - classification_loss: 0.0836 208/500 [===========>..................] - ETA: 1:13 - loss: 0.8423 - regression_loss: 0.7586 - classification_loss: 0.0837 209/500 [===========>..................] - ETA: 1:13 - loss: 0.8428 - regression_loss: 0.7591 - classification_loss: 0.0837 210/500 [===========>..................] - ETA: 1:13 - loss: 0.8418 - regression_loss: 0.7583 - classification_loss: 0.0835 211/500 [===========>..................] - ETA: 1:13 - loss: 0.8418 - regression_loss: 0.7583 - classification_loss: 0.0835 212/500 [===========>..................] - ETA: 1:12 - loss: 0.8433 - regression_loss: 0.7595 - classification_loss: 0.0838 213/500 [===========>..................] - ETA: 1:12 - loss: 0.8429 - regression_loss: 0.7593 - classification_loss: 0.0836 214/500 [===========>..................] - ETA: 1:12 - loss: 0.8448 - regression_loss: 0.7608 - classification_loss: 0.0840 215/500 [===========>..................] - ETA: 1:12 - loss: 0.8443 - regression_loss: 0.7602 - classification_loss: 0.0841 216/500 [===========>..................] - ETA: 1:11 - loss: 0.8463 - regression_loss: 0.7619 - classification_loss: 0.0844 217/500 [============>.................] - ETA: 1:11 - loss: 0.8456 - regression_loss: 0.7612 - classification_loss: 0.0844 218/500 [============>.................] - ETA: 1:11 - loss: 0.8474 - regression_loss: 0.7628 - classification_loss: 0.0846 219/500 [============>.................] - ETA: 1:11 - loss: 0.8497 - regression_loss: 0.7648 - classification_loss: 0.0849 220/500 [============>.................] - ETA: 1:10 - loss: 0.8489 - regression_loss: 0.7640 - classification_loss: 0.0849 221/500 [============>.................] - ETA: 1:10 - loss: 0.8491 - regression_loss: 0.7642 - classification_loss: 0.0849 222/500 [============>.................] 
- ETA: 1:10 - loss: 0.8490 - regression_loss: 0.7641 - classification_loss: 0.0848 223/500 [============>.................] - ETA: 1:10 - loss: 0.8508 - regression_loss: 0.7657 - classification_loss: 0.0851 224/500 [============>.................] - ETA: 1:09 - loss: 0.8486 - regression_loss: 0.7638 - classification_loss: 0.0848 225/500 [============>.................] - ETA: 1:09 - loss: 0.8494 - regression_loss: 0.7647 - classification_loss: 0.0848 226/500 [============>.................] - ETA: 1:09 - loss: 0.8518 - regression_loss: 0.7666 - classification_loss: 0.0852 227/500 [============>.................] - ETA: 1:09 - loss: 0.8511 - regression_loss: 0.7661 - classification_loss: 0.0850 228/500 [============>.................] - ETA: 1:08 - loss: 0.8503 - regression_loss: 0.7654 - classification_loss: 0.0849 229/500 [============>.................] - ETA: 1:08 - loss: 0.8528 - regression_loss: 0.7671 - classification_loss: 0.0857 230/500 [============>.................] - ETA: 1:08 - loss: 0.8512 - regression_loss: 0.7658 - classification_loss: 0.0854 231/500 [============>.................] - ETA: 1:08 - loss: 0.8503 - regression_loss: 0.7649 - classification_loss: 0.0854 232/500 [============>.................] - ETA: 1:07 - loss: 0.8528 - regression_loss: 0.7671 - classification_loss: 0.0857 233/500 [============>.................] - ETA: 1:07 - loss: 0.8532 - regression_loss: 0.7674 - classification_loss: 0.0858 234/500 [=============>................] - ETA: 1:07 - loss: 0.8554 - regression_loss: 0.7692 - classification_loss: 0.0862 235/500 [=============>................] - ETA: 1:07 - loss: 0.8586 - regression_loss: 0.7717 - classification_loss: 0.0868 236/500 [=============>................] - ETA: 1:06 - loss: 0.8579 - regression_loss: 0.7711 - classification_loss: 0.0868 237/500 [=============>................] - ETA: 1:06 - loss: 0.8570 - regression_loss: 0.7704 - classification_loss: 0.0866 238/500 [=============>................] 
- ETA: 1:06 - loss: 0.8551 - regression_loss: 0.7688 - classification_loss: 0.0863 239/500 [=============>................] - ETA: 1:06 - loss: 0.8539 - regression_loss: 0.7678 - classification_loss: 0.0861 240/500 [=============>................] - ETA: 1:05 - loss: 0.8513 - regression_loss: 0.7655 - classification_loss: 0.0858 241/500 [=============>................] - ETA: 1:05 - loss: 0.8520 - regression_loss: 0.7660 - classification_loss: 0.0860 242/500 [=============>................] - ETA: 1:05 - loss: 0.8499 - regression_loss: 0.7643 - classification_loss: 0.0856 243/500 [=============>................] - ETA: 1:05 - loss: 0.8476 - regression_loss: 0.7623 - classification_loss: 0.0854 244/500 [=============>................] - ETA: 1:04 - loss: 0.8490 - regression_loss: 0.7638 - classification_loss: 0.0852 245/500 [=============>................] - ETA: 1:04 - loss: 0.8493 - regression_loss: 0.7639 - classification_loss: 0.0854 246/500 [=============>................] - ETA: 1:04 - loss: 0.8495 - regression_loss: 0.7642 - classification_loss: 0.0853 247/500 [=============>................] - ETA: 1:04 - loss: 0.8484 - regression_loss: 0.7630 - classification_loss: 0.0854 248/500 [=============>................] - ETA: 1:03 - loss: 0.8506 - regression_loss: 0.7649 - classification_loss: 0.0857 249/500 [=============>................] - ETA: 1:03 - loss: 0.8496 - regression_loss: 0.7641 - classification_loss: 0.0855 250/500 [==============>...............] - ETA: 1:03 - loss: 0.8504 - regression_loss: 0.7649 - classification_loss: 0.0856 251/500 [==============>...............] - ETA: 1:03 - loss: 0.8524 - regression_loss: 0.7665 - classification_loss: 0.0859 252/500 [==============>...............] - ETA: 1:02 - loss: 0.8533 - regression_loss: 0.7672 - classification_loss: 0.0861 253/500 [==============>...............] - ETA: 1:02 - loss: 0.8518 - regression_loss: 0.7659 - classification_loss: 0.0859 254/500 [==============>...............] 
- ETA: 1:02 - loss: 0.8511 - regression_loss: 0.7652 - classification_loss: 0.0859 255/500 [==============>...............] - ETA: 1:01 - loss: 0.8493 - regression_loss: 0.7636 - classification_loss: 0.0856 256/500 [==============>...............] - ETA: 1:01 - loss: 0.8495 - regression_loss: 0.7639 - classification_loss: 0.0857 257/500 [==============>...............] - ETA: 1:01 - loss: 0.8492 - regression_loss: 0.7636 - classification_loss: 0.0856 258/500 [==============>...............] - ETA: 1:01 - loss: 0.8488 - regression_loss: 0.7633 - classification_loss: 0.0855 259/500 [==============>...............] - ETA: 1:00 - loss: 0.8497 - regression_loss: 0.7641 - classification_loss: 0.0856 260/500 [==============>...............] - ETA: 1:00 - loss: 0.8515 - regression_loss: 0.7655 - classification_loss: 0.0861 261/500 [==============>...............] - ETA: 1:00 - loss: 0.8535 - regression_loss: 0.7672 - classification_loss: 0.0863 262/500 [==============>...............] - ETA: 1:00 - loss: 0.8539 - regression_loss: 0.7676 - classification_loss: 0.0863 263/500 [==============>...............] - ETA: 1:00 - loss: 0.8547 - regression_loss: 0.7680 - classification_loss: 0.0866 264/500 [==============>...............] - ETA: 59s - loss: 0.8573 - regression_loss: 0.7703 - classification_loss: 0.0870  265/500 [==============>...............] - ETA: 59s - loss: 0.8552 - regression_loss: 0.7684 - classification_loss: 0.0868 266/500 [==============>...............] - ETA: 59s - loss: 0.8533 - regression_loss: 0.7668 - classification_loss: 0.0865 267/500 [===============>..............] - ETA: 59s - loss: 0.8533 - regression_loss: 0.7668 - classification_loss: 0.0865 268/500 [===============>..............] - ETA: 58s - loss: 0.8536 - regression_loss: 0.7673 - classification_loss: 0.0864 269/500 [===============>..............] - ETA: 58s - loss: 0.8538 - regression_loss: 0.7675 - classification_loss: 0.0863 270/500 [===============>..............] 
- ETA: 58s - loss: 0.8548 - regression_loss: 0.7683 - classification_loss: 0.0865 271/500 [===============>..............] - ETA: 58s - loss: 0.8531 - regression_loss: 0.7668 - classification_loss: 0.0863 272/500 [===============>..............] - ETA: 57s - loss: 0.8542 - regression_loss: 0.7678 - classification_loss: 0.0864 273/500 [===============>..............] - ETA: 57s - loss: 0.8544 - regression_loss: 0.7682 - classification_loss: 0.0863 274/500 [===============>..............] - ETA: 57s - loss: 0.8530 - regression_loss: 0.7670 - classification_loss: 0.0860 275/500 [===============>..............] - ETA: 56s - loss: 0.8539 - regression_loss: 0.7678 - classification_loss: 0.0861 276/500 [===============>..............] - ETA: 56s - loss: 0.8547 - regression_loss: 0.7684 - classification_loss: 0.0863 277/500 [===============>..............] - ETA: 56s - loss: 0.8555 - regression_loss: 0.7691 - classification_loss: 0.0864 278/500 [===============>..............] - ETA: 56s - loss: 0.8570 - regression_loss: 0.7704 - classification_loss: 0.0866 279/500 [===============>..............] - ETA: 55s - loss: 0.8559 - regression_loss: 0.7696 - classification_loss: 0.0864 280/500 [===============>..............] - ETA: 55s - loss: 0.8577 - regression_loss: 0.7709 - classification_loss: 0.0868 281/500 [===============>..............] - ETA: 55s - loss: 0.8584 - regression_loss: 0.7715 - classification_loss: 0.0868 282/500 [===============>..............] - ETA: 55s - loss: 0.8590 - regression_loss: 0.7722 - classification_loss: 0.0868 283/500 [===============>..............] - ETA: 54s - loss: 0.8594 - regression_loss: 0.7726 - classification_loss: 0.0868 284/500 [================>.............] - ETA: 54s - loss: 0.8581 - regression_loss: 0.7716 - classification_loss: 0.0865 285/500 [================>.............] - ETA: 54s - loss: 0.8576 - regression_loss: 0.7712 - classification_loss: 0.0864 286/500 [================>.............] 
- ETA: 54s - loss: 0.8581 - regression_loss: 0.7716 - classification_loss: 0.0864 287/500 [================>.............] - ETA: 53s - loss: 0.8584 - regression_loss: 0.7716 - classification_loss: 0.0868 288/500 [================>.............] - ETA: 53s - loss: 0.8585 - regression_loss: 0.7717 - classification_loss: 0.0868 289/500 [================>.............] - ETA: 53s - loss: 0.8589 - regression_loss: 0.7720 - classification_loss: 0.0868 290/500 [================>.............] - ETA: 53s - loss: 0.8594 - regression_loss: 0.7724 - classification_loss: 0.0870 291/500 [================>.............] - ETA: 52s - loss: 0.8577 - regression_loss: 0.7710 - classification_loss: 0.0867 292/500 [================>.............] - ETA: 52s - loss: 0.8582 - regression_loss: 0.7714 - classification_loss: 0.0868 293/500 [================>.............] - ETA: 52s - loss: 0.8591 - regression_loss: 0.7721 - classification_loss: 0.0869 294/500 [================>.............] - ETA: 52s - loss: 0.8592 - regression_loss: 0.7724 - classification_loss: 0.0868 295/500 [================>.............] - ETA: 51s - loss: 0.8577 - regression_loss: 0.7711 - classification_loss: 0.0866 296/500 [================>.............] - ETA: 51s - loss: 0.8586 - regression_loss: 0.7719 - classification_loss: 0.0867 297/500 [================>.............] - ETA: 51s - loss: 0.8569 - regression_loss: 0.7704 - classification_loss: 0.0866 298/500 [================>.............] - ETA: 51s - loss: 0.8586 - regression_loss: 0.7716 - classification_loss: 0.0870 299/500 [================>.............] - ETA: 50s - loss: 0.8594 - regression_loss: 0.7723 - classification_loss: 0.0871 300/500 [=================>............] - ETA: 50s - loss: 0.8596 - regression_loss: 0.7725 - classification_loss: 0.0871 301/500 [=================>............] - ETA: 50s - loss: 0.8602 - regression_loss: 0.7730 - classification_loss: 0.0873 302/500 [=================>............] 
- ETA: 50s - loss: 0.8588 - regression_loss: 0.7716 - classification_loss: 0.0872 303/500 [=================>............] - ETA: 49s - loss: 0.8583 - regression_loss: 0.7712 - classification_loss: 0.0871 304/500 [=================>............] - ETA: 49s - loss: 0.8603 - regression_loss: 0.7730 - classification_loss: 0.0873 305/500 [=================>............] - ETA: 49s - loss: 0.8612 - regression_loss: 0.7738 - classification_loss: 0.0874 306/500 [=================>............] - ETA: 49s - loss: 0.8622 - regression_loss: 0.7746 - classification_loss: 0.0876 307/500 [=================>............] - ETA: 48s - loss: 0.8603 - regression_loss: 0.7729 - classification_loss: 0.0874 308/500 [=================>............] - ETA: 48s - loss: 0.8601 - regression_loss: 0.7726 - classification_loss: 0.0875 309/500 [=================>............] - ETA: 48s - loss: 0.8606 - regression_loss: 0.7731 - classification_loss: 0.0875 310/500 [=================>............] - ETA: 48s - loss: 0.8588 - regression_loss: 0.7715 - classification_loss: 0.0873 311/500 [=================>............] - ETA: 47s - loss: 0.8581 - regression_loss: 0.7710 - classification_loss: 0.0871 312/500 [=================>............] - ETA: 47s - loss: 0.8589 - regression_loss: 0.7716 - classification_loss: 0.0873 313/500 [=================>............] - ETA: 47s - loss: 0.8603 - regression_loss: 0.7728 - classification_loss: 0.0874 314/500 [=================>............] - ETA: 47s - loss: 0.8610 - regression_loss: 0.7734 - classification_loss: 0.0876 315/500 [=================>............] - ETA: 46s - loss: 0.8610 - regression_loss: 0.7734 - classification_loss: 0.0876 316/500 [=================>............] - ETA: 46s - loss: 0.8615 - regression_loss: 0.7737 - classification_loss: 0.0877 317/500 [==================>...........] - ETA: 46s - loss: 0.8611 - regression_loss: 0.7735 - classification_loss: 0.0876 318/500 [==================>...........] 
- ETA: 46s - loss: 0.8617 - regression_loss: 0.7742 - classification_loss: 0.0875 319/500 [==================>...........] - ETA: 45s - loss: 0.8622 - regression_loss: 0.7747 - classification_loss: 0.0876 320/500 [==================>...........] - ETA: 45s - loss: 0.8629 - regression_loss: 0.7754 - classification_loss: 0.0875 321/500 [==================>...........] - ETA: 45s - loss: 0.8631 - regression_loss: 0.7757 - classification_loss: 0.0874 322/500 [==================>...........] - ETA: 45s - loss: 0.8636 - regression_loss: 0.7761 - classification_loss: 0.0875 323/500 [==================>...........] - ETA: 44s - loss: 0.8643 - regression_loss: 0.7767 - classification_loss: 0.0876 324/500 [==================>...........] - ETA: 44s - loss: 0.8632 - regression_loss: 0.7758 - classification_loss: 0.0874 325/500 [==================>...........] - ETA: 44s - loss: 0.8639 - regression_loss: 0.7765 - classification_loss: 0.0874 326/500 [==================>...........] - ETA: 44s - loss: 0.8637 - regression_loss: 0.7764 - classification_loss: 0.0873 327/500 [==================>...........] - ETA: 43s - loss: 0.8637 - regression_loss: 0.7762 - classification_loss: 0.0875 328/500 [==================>...........] - ETA: 43s - loss: 0.8640 - regression_loss: 0.7763 - classification_loss: 0.0877 329/500 [==================>...........] - ETA: 43s - loss: 0.8645 - regression_loss: 0.7768 - classification_loss: 0.0877 330/500 [==================>...........] - ETA: 43s - loss: 0.8642 - regression_loss: 0.7766 - classification_loss: 0.0876 331/500 [==================>...........] - ETA: 42s - loss: 0.8649 - regression_loss: 0.7771 - classification_loss: 0.0878 332/500 [==================>...........] - ETA: 42s - loss: 0.8655 - regression_loss: 0.7777 - classification_loss: 0.0879 333/500 [==================>...........] - ETA: 42s - loss: 0.8657 - regression_loss: 0.7778 - classification_loss: 0.0880 334/500 [===================>..........] 
- ETA: 42s - loss: 0.8645 - regression_loss: 0.7768 - classification_loss: 0.0877 335/500 [===================>..........] - ETA: 41s - loss: 0.8654 - regression_loss: 0.7777 - classification_loss: 0.0877 336/500 [===================>..........] - ETA: 41s - loss: 0.8646 - regression_loss: 0.7770 - classification_loss: 0.0876 337/500 [===================>..........] - ETA: 41s - loss: 0.8652 - regression_loss: 0.7774 - classification_loss: 0.0878 338/500 [===================>..........] - ETA: 41s - loss: 0.8644 - regression_loss: 0.7768 - classification_loss: 0.0876 339/500 [===================>..........] - ETA: 40s - loss: 0.8646 - regression_loss: 0.7770 - classification_loss: 0.0876 340/500 [===================>..........] - ETA: 40s - loss: 0.8659 - regression_loss: 0.7781 - classification_loss: 0.0878 341/500 [===================>..........] - ETA: 40s - loss: 0.8658 - regression_loss: 0.7780 - classification_loss: 0.0878 342/500 [===================>..........] - ETA: 40s - loss: 0.8661 - regression_loss: 0.7782 - classification_loss: 0.0879 343/500 [===================>..........] - ETA: 39s - loss: 0.8678 - regression_loss: 0.7796 - classification_loss: 0.0882 344/500 [===================>..........] - ETA: 39s - loss: 0.8672 - regression_loss: 0.7790 - classification_loss: 0.0882 345/500 [===================>..........] - ETA: 39s - loss: 0.8667 - regression_loss: 0.7785 - classification_loss: 0.0882 346/500 [===================>..........] - ETA: 39s - loss: 0.8659 - regression_loss: 0.7778 - classification_loss: 0.0881 347/500 [===================>..........] - ETA: 38s - loss: 0.8663 - regression_loss: 0.7782 - classification_loss: 0.0881 348/500 [===================>..........] - ETA: 38s - loss: 0.8669 - regression_loss: 0.7789 - classification_loss: 0.0880 349/500 [===================>..........] - ETA: 38s - loss: 0.8656 - regression_loss: 0.7777 - classification_loss: 0.0879 350/500 [====================>.........] 
- ETA: 37s - loss: 0.8665 - regression_loss: 0.7785 - classification_loss: 0.0880 351/500 [====================>.........] - ETA: 37s - loss: 0.8654 - regression_loss: 0.7776 - classification_loss: 0.0878 352/500 [====================>.........] - ETA: 37s - loss: 0.8666 - regression_loss: 0.7787 - classification_loss: 0.0878 353/500 [====================>.........] - ETA: 37s - loss: 0.8657 - regression_loss: 0.7781 - classification_loss: 0.0877 354/500 [====================>.........] - ETA: 36s - loss: 0.8659 - regression_loss: 0.7782 - classification_loss: 0.0877 355/500 [====================>.........] - ETA: 36s - loss: 0.8653 - regression_loss: 0.7778 - classification_loss: 0.0875 356/500 [====================>.........] - ETA: 36s - loss: 0.8656 - regression_loss: 0.7779 - classification_loss: 0.0877 357/500 [====================>.........] - ETA: 36s - loss: 0.8655 - regression_loss: 0.7778 - classification_loss: 0.0877 358/500 [====================>.........] - ETA: 35s - loss: 0.8651 - regression_loss: 0.7774 - classification_loss: 0.0876 359/500 [====================>.........] - ETA: 35s - loss: 0.8653 - regression_loss: 0.7776 - classification_loss: 0.0877 360/500 [====================>.........] - ETA: 35s - loss: 0.8656 - regression_loss: 0.7780 - classification_loss: 0.0876 361/500 [====================>.........] - ETA: 35s - loss: 0.8649 - regression_loss: 0.7774 - classification_loss: 0.0875 362/500 [====================>.........] - ETA: 34s - loss: 0.8664 - regression_loss: 0.7787 - classification_loss: 0.0878 363/500 [====================>.........] - ETA: 34s - loss: 0.8673 - regression_loss: 0.7795 - classification_loss: 0.0878 364/500 [====================>.........] - ETA: 34s - loss: 0.8692 - regression_loss: 0.7811 - classification_loss: 0.0881 365/500 [====================>.........] - ETA: 34s - loss: 0.8682 - regression_loss: 0.7803 - classification_loss: 0.0880 366/500 [====================>.........] 
- ETA: 33s - loss: 0.8680 - regression_loss: 0.7799 - classification_loss: 0.0880 367/500 [=====================>........] - ETA: 33s - loss: 0.8678 - regression_loss: 0.7799 - classification_loss: 0.0879 368/500 [=====================>........] - ETA: 33s - loss: 0.8670 - regression_loss: 0.7792 - classification_loss: 0.0878 369/500 [=====================>........] - ETA: 33s - loss: 0.8666 - regression_loss: 0.7789 - classification_loss: 0.0877 370/500 [=====================>........] - ETA: 32s - loss: 0.8668 - regression_loss: 0.7790 - classification_loss: 0.0878 371/500 [=====================>........] - ETA: 32s - loss: 0.8694 - regression_loss: 0.7813 - classification_loss: 0.0881 372/500 [=====================>........] - ETA: 32s - loss: 0.8705 - regression_loss: 0.7824 - classification_loss: 0.0881 373/500 [=====================>........] - ETA: 32s - loss: 0.8687 - regression_loss: 0.7808 - classification_loss: 0.0879 374/500 [=====================>........] - ETA: 31s - loss: 0.8675 - regression_loss: 0.7798 - classification_loss: 0.0878 375/500 [=====================>........] - ETA: 31s - loss: 0.8662 - regression_loss: 0.7786 - classification_loss: 0.0876 376/500 [=====================>........] - ETA: 31s - loss: 0.8667 - regression_loss: 0.7791 - classification_loss: 0.0876 377/500 [=====================>........] - ETA: 31s - loss: 0.8677 - regression_loss: 0.7800 - classification_loss: 0.0877 378/500 [=====================>........] - ETA: 30s - loss: 0.8666 - regression_loss: 0.7791 - classification_loss: 0.0875 379/500 [=====================>........] - ETA: 30s - loss: 0.8681 - regression_loss: 0.7801 - classification_loss: 0.0880 380/500 [=====================>........] - ETA: 30s - loss: 0.8681 - regression_loss: 0.7802 - classification_loss: 0.0880 381/500 [=====================>........] - ETA: 30s - loss: 0.8684 - regression_loss: 0.7803 - classification_loss: 0.0880 382/500 [=====================>........] 
- ETA: 29s - loss: 0.8677 - regression_loss: 0.7798 - classification_loss: 0.0880 383/500 [=====================>........] - ETA: 29s - loss: 0.8679 - regression_loss: 0.7799 - classification_loss: 0.0880 384/500 [======================>.......] - ETA: 29s - loss: 0.8677 - regression_loss: 0.7797 - classification_loss: 0.0880 385/500 [======================>.......] - ETA: 29s - loss: 0.8677 - regression_loss: 0.7797 - classification_loss: 0.0880 386/500 [======================>.......] - ETA: 28s - loss: 0.8685 - regression_loss: 0.7804 - classification_loss: 0.0882 387/500 [======================>.......] - ETA: 28s - loss: 0.8695 - regression_loss: 0.7812 - classification_loss: 0.0884 388/500 [======================>.......] - ETA: 28s - loss: 0.8701 - regression_loss: 0.7817 - classification_loss: 0.0885 389/500 [======================>.......] - ETA: 28s - loss: 0.8709 - regression_loss: 0.7823 - classification_loss: 0.0885 390/500 [======================>.......] - ETA: 27s - loss: 0.8727 - regression_loss: 0.7839 - classification_loss: 0.0888 391/500 [======================>.......] - ETA: 27s - loss: 0.8728 - regression_loss: 0.7840 - classification_loss: 0.0887 392/500 [======================>.......] - ETA: 27s - loss: 0.8729 - regression_loss: 0.7843 - classification_loss: 0.0887 393/500 [======================>.......] - ETA: 27s - loss: 0.8731 - regression_loss: 0.7845 - classification_loss: 0.0886 394/500 [======================>.......] - ETA: 26s - loss: 0.8726 - regression_loss: 0.7841 - classification_loss: 0.0885 395/500 [======================>.......] - ETA: 26s - loss: 0.8731 - regression_loss: 0.7845 - classification_loss: 0.0886 396/500 [======================>.......] - ETA: 26s - loss: 0.8728 - regression_loss: 0.7842 - classification_loss: 0.0886 397/500 [======================>.......] - ETA: 26s - loss: 0.8723 - regression_loss: 0.7838 - classification_loss: 0.0885 398/500 [======================>.......] 
500/500 [==============================] - 127s 254ms/step - loss: 0.8810 - regression_loss: 0.7912 - classification_loss: 0.0898
1172 instances of class plum with average precision: 0.7852
mAP: 0.7852
Epoch 00050: saving model to ./training/snapshots/resnet50_pascal_50.h5
Epoch 51/150
- ETA: 1:07 - loss: 0.8919 - regression_loss: 0.8020 - classification_loss: 0.0900 234/500 [=============>................] - ETA: 1:06 - loss: 0.8926 - regression_loss: 0.8024 - classification_loss: 0.0902 235/500 [=============>................] - ETA: 1:06 - loss: 0.8930 - regression_loss: 0.8027 - classification_loss: 0.0903 236/500 [=============>................] - ETA: 1:06 - loss: 0.8935 - regression_loss: 0.8033 - classification_loss: 0.0903 237/500 [=============>................] - ETA: 1:06 - loss: 0.8938 - regression_loss: 0.8036 - classification_loss: 0.0902 238/500 [=============>................] - ETA: 1:05 - loss: 0.8936 - regression_loss: 0.8035 - classification_loss: 0.0901 239/500 [=============>................] - ETA: 1:05 - loss: 0.8940 - regression_loss: 0.8037 - classification_loss: 0.0902 240/500 [=============>................] - ETA: 1:05 - loss: 0.8940 - regression_loss: 0.8038 - classification_loss: 0.0902 241/500 [=============>................] - ETA: 1:05 - loss: 0.8944 - regression_loss: 0.8042 - classification_loss: 0.0902 242/500 [=============>................] - ETA: 1:04 - loss: 0.8928 - regression_loss: 0.8026 - classification_loss: 0.0902 243/500 [=============>................] - ETA: 1:04 - loss: 0.8907 - regression_loss: 0.8008 - classification_loss: 0.0899 244/500 [=============>................] - ETA: 1:04 - loss: 0.8932 - regression_loss: 0.8028 - classification_loss: 0.0904 245/500 [=============>................] - ETA: 1:04 - loss: 0.8911 - regression_loss: 0.8009 - classification_loss: 0.0902 246/500 [=============>................] - ETA: 1:03 - loss: 0.8898 - regression_loss: 0.7999 - classification_loss: 0.0899 247/500 [=============>................] - ETA: 1:03 - loss: 0.8873 - regression_loss: 0.7978 - classification_loss: 0.0896 248/500 [=============>................] - ETA: 1:03 - loss: 0.8856 - regression_loss: 0.7962 - classification_loss: 0.0894 249/500 [=============>................] 
- ETA: 1:03 - loss: 0.8832 - regression_loss: 0.7941 - classification_loss: 0.0891 250/500 [==============>...............] - ETA: 1:03 - loss: 0.8823 - regression_loss: 0.7934 - classification_loss: 0.0889 251/500 [==============>...............] - ETA: 1:02 - loss: 0.8830 - regression_loss: 0.7941 - classification_loss: 0.0890 252/500 [==============>...............] - ETA: 1:02 - loss: 0.8843 - regression_loss: 0.7951 - classification_loss: 0.0892 253/500 [==============>...............] - ETA: 1:02 - loss: 0.8863 - regression_loss: 0.7967 - classification_loss: 0.0895 254/500 [==============>...............] - ETA: 1:02 - loss: 0.8855 - regression_loss: 0.7961 - classification_loss: 0.0894 255/500 [==============>...............] - ETA: 1:01 - loss: 0.8872 - regression_loss: 0.7976 - classification_loss: 0.0896 256/500 [==============>...............] - ETA: 1:01 - loss: 0.8852 - regression_loss: 0.7958 - classification_loss: 0.0894 257/500 [==============>...............] - ETA: 1:01 - loss: 0.8861 - regression_loss: 0.7965 - classification_loss: 0.0896 258/500 [==============>...............] - ETA: 1:01 - loss: 0.8840 - regression_loss: 0.7947 - classification_loss: 0.0893 259/500 [==============>...............] - ETA: 1:00 - loss: 0.8849 - regression_loss: 0.7956 - classification_loss: 0.0894 260/500 [==============>...............] - ETA: 1:00 - loss: 0.8842 - regression_loss: 0.7950 - classification_loss: 0.0892 261/500 [==============>...............] - ETA: 1:00 - loss: 0.8826 - regression_loss: 0.7936 - classification_loss: 0.0889 262/500 [==============>...............] - ETA: 1:00 - loss: 0.8809 - regression_loss: 0.7921 - classification_loss: 0.0888 263/500 [==============>...............] - ETA: 59s - loss: 0.8834 - regression_loss: 0.7942 - classification_loss: 0.0892  264/500 [==============>...............] - ETA: 59s - loss: 0.8835 - regression_loss: 0.7944 - classification_loss: 0.0892 265/500 [==============>...............] 
- ETA: 59s - loss: 0.8837 - regression_loss: 0.7945 - classification_loss: 0.0892 266/500 [==============>...............] - ETA: 59s - loss: 0.8847 - regression_loss: 0.7949 - classification_loss: 0.0897 267/500 [===============>..............] - ETA: 58s - loss: 0.8842 - regression_loss: 0.7944 - classification_loss: 0.0898 268/500 [===============>..............] - ETA: 58s - loss: 0.8838 - regression_loss: 0.7942 - classification_loss: 0.0896 269/500 [===============>..............] - ETA: 58s - loss: 0.8826 - regression_loss: 0.7933 - classification_loss: 0.0894 270/500 [===============>..............] - ETA: 58s - loss: 0.8830 - regression_loss: 0.7935 - classification_loss: 0.0894 271/500 [===============>..............] - ETA: 57s - loss: 0.8837 - regression_loss: 0.7944 - classification_loss: 0.0893 272/500 [===============>..............] - ETA: 57s - loss: 0.8839 - regression_loss: 0.7946 - classification_loss: 0.0894 273/500 [===============>..............] - ETA: 57s - loss: 0.8855 - regression_loss: 0.7957 - classification_loss: 0.0897 274/500 [===============>..............] - ETA: 57s - loss: 0.8854 - regression_loss: 0.7956 - classification_loss: 0.0898 275/500 [===============>..............] - ETA: 56s - loss: 0.8871 - regression_loss: 0.7970 - classification_loss: 0.0902 276/500 [===============>..............] - ETA: 56s - loss: 0.8880 - regression_loss: 0.7973 - classification_loss: 0.0906 277/500 [===============>..............] - ETA: 56s - loss: 0.8889 - regression_loss: 0.7981 - classification_loss: 0.0909 278/500 [===============>..............] - ETA: 56s - loss: 0.8880 - regression_loss: 0.7973 - classification_loss: 0.0907 279/500 [===============>..............] - ETA: 55s - loss: 0.8883 - regression_loss: 0.7975 - classification_loss: 0.0908 280/500 [===============>..............] - ETA: 55s - loss: 0.8881 - regression_loss: 0.7973 - classification_loss: 0.0908 281/500 [===============>..............] 
- ETA: 55s - loss: 0.8889 - regression_loss: 0.7979 - classification_loss: 0.0911 282/500 [===============>..............] - ETA: 55s - loss: 0.8875 - regression_loss: 0.7967 - classification_loss: 0.0908 283/500 [===============>..............] - ETA: 54s - loss: 0.8873 - regression_loss: 0.7965 - classification_loss: 0.0908 284/500 [================>.............] - ETA: 54s - loss: 0.8883 - regression_loss: 0.7972 - classification_loss: 0.0910 285/500 [================>.............] - ETA: 54s - loss: 0.8892 - regression_loss: 0.7980 - classification_loss: 0.0911 286/500 [================>.............] - ETA: 54s - loss: 0.8898 - regression_loss: 0.7986 - classification_loss: 0.0912 287/500 [================>.............] - ETA: 53s - loss: 0.8902 - regression_loss: 0.7989 - classification_loss: 0.0913 288/500 [================>.............] - ETA: 53s - loss: 0.8910 - regression_loss: 0.7996 - classification_loss: 0.0914 289/500 [================>.............] - ETA: 53s - loss: 0.8917 - regression_loss: 0.8002 - classification_loss: 0.0915 290/500 [================>.............] - ETA: 53s - loss: 0.8904 - regression_loss: 0.7991 - classification_loss: 0.0914 291/500 [================>.............] - ETA: 52s - loss: 0.8899 - regression_loss: 0.7985 - classification_loss: 0.0914 292/500 [================>.............] - ETA: 52s - loss: 0.8897 - regression_loss: 0.7984 - classification_loss: 0.0913 293/500 [================>.............] - ETA: 52s - loss: 0.8894 - regression_loss: 0.7983 - classification_loss: 0.0912 294/500 [================>.............] - ETA: 52s - loss: 0.8895 - regression_loss: 0.7983 - classification_loss: 0.0912 295/500 [================>.............] - ETA: 51s - loss: 0.8892 - regression_loss: 0.7979 - classification_loss: 0.0913 296/500 [================>.............] - ETA: 51s - loss: 0.8897 - regression_loss: 0.7983 - classification_loss: 0.0914 297/500 [================>.............] 
- ETA: 51s - loss: 0.8911 - regression_loss: 0.7995 - classification_loss: 0.0917 298/500 [================>.............] - ETA: 51s - loss: 0.8916 - regression_loss: 0.7998 - classification_loss: 0.0918 299/500 [================>.............] - ETA: 50s - loss: 0.8929 - regression_loss: 0.8009 - classification_loss: 0.0920 300/500 [=================>............] - ETA: 50s - loss: 0.8916 - regression_loss: 0.7998 - classification_loss: 0.0918 301/500 [=================>............] - ETA: 50s - loss: 0.8901 - regression_loss: 0.7984 - classification_loss: 0.0917 302/500 [=================>............] - ETA: 50s - loss: 0.8897 - regression_loss: 0.7982 - classification_loss: 0.0916 303/500 [=================>............] - ETA: 49s - loss: 0.8896 - regression_loss: 0.7982 - classification_loss: 0.0914 304/500 [=================>............] - ETA: 49s - loss: 0.8896 - regression_loss: 0.7981 - classification_loss: 0.0914 305/500 [=================>............] - ETA: 49s - loss: 0.8887 - regression_loss: 0.7975 - classification_loss: 0.0912 306/500 [=================>............] - ETA: 49s - loss: 0.8875 - regression_loss: 0.7965 - classification_loss: 0.0910 307/500 [=================>............] - ETA: 48s - loss: 0.8879 - regression_loss: 0.7968 - classification_loss: 0.0911 308/500 [=================>............] - ETA: 48s - loss: 0.8862 - regression_loss: 0.7954 - classification_loss: 0.0909 309/500 [=================>............] - ETA: 48s - loss: 0.8860 - regression_loss: 0.7951 - classification_loss: 0.0909 310/500 [=================>............] - ETA: 48s - loss: 0.8851 - regression_loss: 0.7945 - classification_loss: 0.0906 311/500 [=================>............] - ETA: 47s - loss: 0.8849 - regression_loss: 0.7944 - classification_loss: 0.0904 312/500 [=================>............] - ETA: 47s - loss: 0.8856 - regression_loss: 0.7952 - classification_loss: 0.0905 313/500 [=================>............] 
- ETA: 47s - loss: 0.8854 - regression_loss: 0.7951 - classification_loss: 0.0903 314/500 [=================>............] - ETA: 47s - loss: 0.8851 - regression_loss: 0.7949 - classification_loss: 0.0902 315/500 [=================>............] - ETA: 46s - loss: 0.8864 - regression_loss: 0.7960 - classification_loss: 0.0904 316/500 [=================>............] - ETA: 46s - loss: 0.8855 - regression_loss: 0.7953 - classification_loss: 0.0902 317/500 [==================>...........] - ETA: 46s - loss: 0.8847 - regression_loss: 0.7948 - classification_loss: 0.0900 318/500 [==================>...........] - ETA: 46s - loss: 0.8855 - regression_loss: 0.7954 - classification_loss: 0.0900 319/500 [==================>...........] - ETA: 45s - loss: 0.8870 - regression_loss: 0.7968 - classification_loss: 0.0902 320/500 [==================>...........] - ETA: 45s - loss: 0.8860 - regression_loss: 0.7959 - classification_loss: 0.0900 321/500 [==================>...........] - ETA: 45s - loss: 0.8878 - regression_loss: 0.7975 - classification_loss: 0.0902 322/500 [==================>...........] - ETA: 45s - loss: 0.8858 - regression_loss: 0.7958 - classification_loss: 0.0900 323/500 [==================>...........] - ETA: 44s - loss: 0.8864 - regression_loss: 0.7963 - classification_loss: 0.0902 324/500 [==================>...........] - ETA: 44s - loss: 0.8853 - regression_loss: 0.7952 - classification_loss: 0.0900 325/500 [==================>...........] - ETA: 44s - loss: 0.8853 - regression_loss: 0.7953 - classification_loss: 0.0901 326/500 [==================>...........] - ETA: 44s - loss: 0.8866 - regression_loss: 0.7964 - classification_loss: 0.0903 327/500 [==================>...........] - ETA: 43s - loss: 0.8858 - regression_loss: 0.7957 - classification_loss: 0.0901 328/500 [==================>...........] - ETA: 43s - loss: 0.8856 - regression_loss: 0.7956 - classification_loss: 0.0900 329/500 [==================>...........] 
- ETA: 43s - loss: 0.8879 - regression_loss: 0.7975 - classification_loss: 0.0903 330/500 [==================>...........] - ETA: 42s - loss: 0.8893 - regression_loss: 0.7987 - classification_loss: 0.0906 331/500 [==================>...........] - ETA: 42s - loss: 0.8912 - regression_loss: 0.8003 - classification_loss: 0.0909 332/500 [==================>...........] - ETA: 42s - loss: 0.8916 - regression_loss: 0.8007 - classification_loss: 0.0909 333/500 [==================>...........] - ETA: 42s - loss: 0.8909 - regression_loss: 0.8002 - classification_loss: 0.0908 334/500 [===================>..........] - ETA: 42s - loss: 0.8915 - regression_loss: 0.8006 - classification_loss: 0.0909 335/500 [===================>..........] - ETA: 41s - loss: 0.8922 - regression_loss: 0.8012 - classification_loss: 0.0910 336/500 [===================>..........] - ETA: 41s - loss: 0.8905 - regression_loss: 0.7997 - classification_loss: 0.0908 337/500 [===================>..........] - ETA: 41s - loss: 0.8910 - regression_loss: 0.8001 - classification_loss: 0.0909 338/500 [===================>..........] - ETA: 40s - loss: 0.8896 - regression_loss: 0.7989 - classification_loss: 0.0907 339/500 [===================>..........] - ETA: 40s - loss: 0.8916 - regression_loss: 0.8005 - classification_loss: 0.0911 340/500 [===================>..........] - ETA: 40s - loss: 0.8928 - regression_loss: 0.8014 - classification_loss: 0.0913 341/500 [===================>..........] - ETA: 40s - loss: 0.8941 - regression_loss: 0.8026 - classification_loss: 0.0916 342/500 [===================>..........] - ETA: 39s - loss: 0.8927 - regression_loss: 0.8014 - classification_loss: 0.0914 343/500 [===================>..........] - ETA: 39s - loss: 0.8915 - regression_loss: 0.8003 - classification_loss: 0.0912 344/500 [===================>..........] - ETA: 39s - loss: 0.8904 - regression_loss: 0.7994 - classification_loss: 0.0910 345/500 [===================>..........] 
- ETA: 39s - loss: 0.8904 - regression_loss: 0.7993 - classification_loss: 0.0911 346/500 [===================>..........] - ETA: 38s - loss: 0.8905 - regression_loss: 0.7994 - classification_loss: 0.0911 347/500 [===================>..........] - ETA: 38s - loss: 0.8899 - regression_loss: 0.7987 - classification_loss: 0.0912 348/500 [===================>..........] - ETA: 38s - loss: 0.8892 - regression_loss: 0.7981 - classification_loss: 0.0911 349/500 [===================>..........] - ETA: 38s - loss: 0.8900 - regression_loss: 0.7987 - classification_loss: 0.0913 350/500 [====================>.........] - ETA: 37s - loss: 0.8889 - regression_loss: 0.7978 - classification_loss: 0.0911 351/500 [====================>.........] - ETA: 37s - loss: 0.8894 - regression_loss: 0.7983 - classification_loss: 0.0911 352/500 [====================>.........] - ETA: 37s - loss: 0.8881 - regression_loss: 0.7972 - classification_loss: 0.0909 353/500 [====================>.........] - ETA: 37s - loss: 0.8879 - regression_loss: 0.7969 - classification_loss: 0.0910 354/500 [====================>.........] - ETA: 36s - loss: 0.8880 - regression_loss: 0.7969 - classification_loss: 0.0911 355/500 [====================>.........] - ETA: 36s - loss: 0.8866 - regression_loss: 0.7957 - classification_loss: 0.0909 356/500 [====================>.........] - ETA: 36s - loss: 0.8867 - regression_loss: 0.7958 - classification_loss: 0.0909 357/500 [====================>.........] - ETA: 36s - loss: 0.8859 - regression_loss: 0.7951 - classification_loss: 0.0907 358/500 [====================>.........] - ETA: 35s - loss: 0.8850 - regression_loss: 0.7944 - classification_loss: 0.0906 359/500 [====================>.........] - ETA: 35s - loss: 0.8835 - regression_loss: 0.7931 - classification_loss: 0.0904 360/500 [====================>.........] - ETA: 35s - loss: 0.8845 - regression_loss: 0.7940 - classification_loss: 0.0905 361/500 [====================>.........] 
- ETA: 35s - loss: 0.8836 - regression_loss: 0.7932 - classification_loss: 0.0904 362/500 [====================>.........] - ETA: 34s - loss: 0.8835 - regression_loss: 0.7931 - classification_loss: 0.0904 363/500 [====================>.........] - ETA: 34s - loss: 0.8839 - regression_loss: 0.7934 - classification_loss: 0.0905 364/500 [====================>.........] - ETA: 34s - loss: 0.8834 - regression_loss: 0.7930 - classification_loss: 0.0904 365/500 [====================>.........] - ETA: 34s - loss: 0.8852 - regression_loss: 0.7946 - classification_loss: 0.0906 366/500 [====================>.........] - ETA: 33s - loss: 0.8844 - regression_loss: 0.7939 - classification_loss: 0.0905 367/500 [=====================>........] - ETA: 33s - loss: 0.8847 - regression_loss: 0.7941 - classification_loss: 0.0905 368/500 [=====================>........] - ETA: 33s - loss: 0.8828 - regression_loss: 0.7924 - classification_loss: 0.0904 369/500 [=====================>........] - ETA: 33s - loss: 0.8818 - regression_loss: 0.7916 - classification_loss: 0.0902 370/500 [=====================>........] - ETA: 32s - loss: 0.8819 - regression_loss: 0.7917 - classification_loss: 0.0902 371/500 [=====================>........] - ETA: 32s - loss: 0.8823 - regression_loss: 0.7921 - classification_loss: 0.0902 372/500 [=====================>........] - ETA: 32s - loss: 0.8812 - regression_loss: 0.7910 - classification_loss: 0.0902 373/500 [=====================>........] - ETA: 32s - loss: 0.8800 - regression_loss: 0.7900 - classification_loss: 0.0900 374/500 [=====================>........] - ETA: 31s - loss: 0.8807 - regression_loss: 0.7906 - classification_loss: 0.0902 375/500 [=====================>........] - ETA: 31s - loss: 0.8810 - regression_loss: 0.7907 - classification_loss: 0.0903 376/500 [=====================>........] - ETA: 31s - loss: 0.8804 - regression_loss: 0.7902 - classification_loss: 0.0902 377/500 [=====================>........] 
- ETA: 31s - loss: 0.8804 - regression_loss: 0.7903 - classification_loss: 0.0901 378/500 [=====================>........] - ETA: 30s - loss: 0.8797 - regression_loss: 0.7898 - classification_loss: 0.0900 379/500 [=====================>........] - ETA: 30s - loss: 0.8794 - regression_loss: 0.7895 - classification_loss: 0.0899 380/500 [=====================>........] - ETA: 30s - loss: 0.8790 - regression_loss: 0.7892 - classification_loss: 0.0899 381/500 [=====================>........] - ETA: 30s - loss: 0.8796 - regression_loss: 0.7897 - classification_loss: 0.0899 382/500 [=====================>........] - ETA: 29s - loss: 0.8803 - regression_loss: 0.7901 - classification_loss: 0.0902 383/500 [=====================>........] - ETA: 29s - loss: 0.8792 - regression_loss: 0.7893 - classification_loss: 0.0900 384/500 [======================>.......] - ETA: 29s - loss: 0.8795 - regression_loss: 0.7896 - classification_loss: 0.0899 385/500 [======================>.......] - ETA: 29s - loss: 0.8790 - regression_loss: 0.7892 - classification_loss: 0.0898 386/500 [======================>.......] - ETA: 28s - loss: 0.8786 - regression_loss: 0.7890 - classification_loss: 0.0896 387/500 [======================>.......] - ETA: 28s - loss: 0.8780 - regression_loss: 0.7885 - classification_loss: 0.0895 388/500 [======================>.......] - ETA: 28s - loss: 0.8782 - regression_loss: 0.7886 - classification_loss: 0.0896 389/500 [======================>.......] - ETA: 28s - loss: 0.8787 - regression_loss: 0.7892 - classification_loss: 0.0896 390/500 [======================>.......] - ETA: 27s - loss: 0.8791 - regression_loss: 0.7895 - classification_loss: 0.0896 391/500 [======================>.......] - ETA: 27s - loss: 0.8774 - regression_loss: 0.7880 - classification_loss: 0.0894 392/500 [======================>.......] - ETA: 27s - loss: 0.8773 - regression_loss: 0.7878 - classification_loss: 0.0895 393/500 [======================>.......] 
- ETA: 27s - loss: 0.8764 - regression_loss: 0.7870 - classification_loss: 0.0894 394/500 [======================>.......] - ETA: 26s - loss: 0.8765 - regression_loss: 0.7871 - classification_loss: 0.0894 395/500 [======================>.......] - ETA: 26s - loss: 0.8763 - regression_loss: 0.7869 - classification_loss: 0.0894 396/500 [======================>.......] - ETA: 26s - loss: 0.8772 - regression_loss: 0.7876 - classification_loss: 0.0896 397/500 [======================>.......] - ETA: 26s - loss: 0.8768 - regression_loss: 0.7872 - classification_loss: 0.0895 398/500 [======================>.......] - ETA: 25s - loss: 0.8761 - regression_loss: 0.7867 - classification_loss: 0.0895 399/500 [======================>.......] - ETA: 25s - loss: 0.8766 - regression_loss: 0.7872 - classification_loss: 0.0895 400/500 [=======================>......] - ETA: 25s - loss: 0.8760 - regression_loss: 0.7864 - classification_loss: 0.0896 401/500 [=======================>......] - ETA: 25s - loss: 0.8765 - regression_loss: 0.7868 - classification_loss: 0.0896 402/500 [=======================>......] - ETA: 24s - loss: 0.8758 - regression_loss: 0.7863 - classification_loss: 0.0895 403/500 [=======================>......] - ETA: 24s - loss: 0.8765 - regression_loss: 0.7869 - classification_loss: 0.0896 404/500 [=======================>......] - ETA: 24s - loss: 0.8763 - regression_loss: 0.7868 - classification_loss: 0.0895 405/500 [=======================>......] - ETA: 24s - loss: 0.8768 - regression_loss: 0.7872 - classification_loss: 0.0896 406/500 [=======================>......] - ETA: 23s - loss: 0.8771 - regression_loss: 0.7874 - classification_loss: 0.0896 407/500 [=======================>......] - ETA: 23s - loss: 0.8772 - regression_loss: 0.7876 - classification_loss: 0.0897 408/500 [=======================>......] - ETA: 23s - loss: 0.8776 - regression_loss: 0.7879 - classification_loss: 0.0896 409/500 [=======================>......] 
- ETA: 23s - loss: 0.8789 - regression_loss: 0.7893 - classification_loss: 0.0896 410/500 [=======================>......] - ETA: 22s - loss: 0.8791 - regression_loss: 0.7895 - classification_loss: 0.0896 411/500 [=======================>......] - ETA: 22s - loss: 0.8802 - regression_loss: 0.7906 - classification_loss: 0.0896 412/500 [=======================>......] - ETA: 22s - loss: 0.8807 - regression_loss: 0.7910 - classification_loss: 0.0897 413/500 [=======================>......] - ETA: 22s - loss: 0.8809 - regression_loss: 0.7911 - classification_loss: 0.0898 414/500 [=======================>......] - ETA: 21s - loss: 0.8806 - regression_loss: 0.7908 - classification_loss: 0.0898 415/500 [=======================>......] - ETA: 21s - loss: 0.8815 - regression_loss: 0.7915 - classification_loss: 0.0900 416/500 [=======================>......] - ETA: 21s - loss: 0.8813 - regression_loss: 0.7911 - classification_loss: 0.0902 417/500 [========================>.....] - ETA: 20s - loss: 0.8806 - regression_loss: 0.7905 - classification_loss: 0.0901 418/500 [========================>.....] - ETA: 20s - loss: 0.8794 - regression_loss: 0.7895 - classification_loss: 0.0899 419/500 [========================>.....] - ETA: 20s - loss: 0.8782 - regression_loss: 0.7885 - classification_loss: 0.0898 420/500 [========================>.....] - ETA: 20s - loss: 0.8773 - regression_loss: 0.7877 - classification_loss: 0.0896 421/500 [========================>.....] - ETA: 19s - loss: 0.8766 - regression_loss: 0.7871 - classification_loss: 0.0895 422/500 [========================>.....] - ETA: 19s - loss: 0.8769 - regression_loss: 0.7873 - classification_loss: 0.0896 423/500 [========================>.....] - ETA: 19s - loss: 0.8764 - regression_loss: 0.7870 - classification_loss: 0.0895 424/500 [========================>.....] - ETA: 19s - loss: 0.8763 - regression_loss: 0.7868 - classification_loss: 0.0894 425/500 [========================>.....] 
- ETA: 18s - loss: 0.8767 - regression_loss: 0.7872 - classification_loss: 0.0895 426/500 [========================>.....] - ETA: 18s - loss: 0.8768 - regression_loss: 0.7872 - classification_loss: 0.0896 427/500 [========================>.....] - ETA: 18s - loss: 0.8776 - regression_loss: 0.7879 - classification_loss: 0.0897 428/500 [========================>.....] - ETA: 18s - loss: 0.8771 - regression_loss: 0.7876 - classification_loss: 0.0896 429/500 [========================>.....] - ETA: 17s - loss: 0.8765 - regression_loss: 0.7871 - classification_loss: 0.0894 430/500 [========================>.....] - ETA: 17s - loss: 0.8767 - regression_loss: 0.7872 - classification_loss: 0.0895 431/500 [========================>.....] - ETA: 17s - loss: 0.8775 - regression_loss: 0.7879 - classification_loss: 0.0896 432/500 [========================>.....] - ETA: 17s - loss: 0.8763 - regression_loss: 0.7869 - classification_loss: 0.0894 433/500 [========================>.....] - ETA: 16s - loss: 0.8764 - regression_loss: 0.7871 - classification_loss: 0.0893 434/500 [=========================>....] - ETA: 16s - loss: 0.8762 - regression_loss: 0.7868 - classification_loss: 0.0893 435/500 [=========================>....] - ETA: 16s - loss: 0.8759 - regression_loss: 0.7867 - classification_loss: 0.0892 436/500 [=========================>....] - ETA: 16s - loss: 0.8760 - regression_loss: 0.7867 - classification_loss: 0.0893 437/500 [=========================>....] - ETA: 15s - loss: 0.8745 - regression_loss: 0.7854 - classification_loss: 0.0891 438/500 [=========================>....] - ETA: 15s - loss: 0.8746 - regression_loss: 0.7855 - classification_loss: 0.0891 439/500 [=========================>....] - ETA: 15s - loss: 0.8757 - regression_loss: 0.7864 - classification_loss: 0.0893 440/500 [=========================>....] - ETA: 15s - loss: 0.8752 - regression_loss: 0.7860 - classification_loss: 0.0892 441/500 [=========================>....] 
[... per-batch progress updates 442/500 through 489/500 of epoch 51 elided; loss held near 0.878 (regression_loss ~0.789, classification_loss ~0.089) ...]
500/500 [==============================] - 127s 253ms/step - loss: 0.8785 - regression_loss: 0.7891 - classification_loss: 0.0894
1172 instances of class plum with average precision: 0.7823
mAP: 0.7823
Epoch 00051: saving model to ./training/snapshots/resnet50_pascal_51.h5
Epoch 52/150
[... per-batch progress updates 1/500 through 276/500 of epoch 52 elided; loss fluctuated between roughly 0.66 and 0.99 before settling near 0.89 ...]
- ETA: 56s - loss: 0.8873 - regression_loss: 0.7946 - classification_loss: 0.0927 277/500 [===============>..............] - ETA: 56s - loss: 0.8875 - regression_loss: 0.7948 - classification_loss: 0.0926 278/500 [===============>..............] - ETA: 56s - loss: 0.8858 - regression_loss: 0.7934 - classification_loss: 0.0923 279/500 [===============>..............] - ETA: 55s - loss: 0.8857 - regression_loss: 0.7935 - classification_loss: 0.0922 280/500 [===============>..............] - ETA: 55s - loss: 0.8871 - regression_loss: 0.7947 - classification_loss: 0.0924 281/500 [===============>..............] - ETA: 55s - loss: 0.8866 - regression_loss: 0.7944 - classification_loss: 0.0921 282/500 [===============>..............] - ETA: 55s - loss: 0.8875 - regression_loss: 0.7953 - classification_loss: 0.0922 283/500 [===============>..............] - ETA: 54s - loss: 0.8854 - regression_loss: 0.7935 - classification_loss: 0.0919 284/500 [================>.............] - ETA: 54s - loss: 0.8835 - regression_loss: 0.7919 - classification_loss: 0.0916 285/500 [================>.............] - ETA: 54s - loss: 0.8846 - regression_loss: 0.7927 - classification_loss: 0.0919 286/500 [================>.............] - ETA: 54s - loss: 0.8848 - regression_loss: 0.7929 - classification_loss: 0.0919 287/500 [================>.............] - ETA: 53s - loss: 0.8854 - regression_loss: 0.7933 - classification_loss: 0.0920 288/500 [================>.............] - ETA: 53s - loss: 0.8847 - regression_loss: 0.7929 - classification_loss: 0.0918 289/500 [================>.............] - ETA: 53s - loss: 0.8852 - regression_loss: 0.7935 - classification_loss: 0.0918 290/500 [================>.............] - ETA: 53s - loss: 0.8858 - regression_loss: 0.7939 - classification_loss: 0.0918 291/500 [================>.............] - ETA: 52s - loss: 0.8847 - regression_loss: 0.7930 - classification_loss: 0.0916 292/500 [================>.............] 
- ETA: 52s - loss: 0.8851 - regression_loss: 0.7935 - classification_loss: 0.0916 293/500 [================>.............] - ETA: 52s - loss: 0.8922 - regression_loss: 0.7980 - classification_loss: 0.0942 294/500 [================>.............] - ETA: 52s - loss: 0.8927 - regression_loss: 0.7986 - classification_loss: 0.0941 295/500 [================>.............] - ETA: 51s - loss: 0.8934 - regression_loss: 0.7992 - classification_loss: 0.0942 296/500 [================>.............] - ETA: 51s - loss: 0.8921 - regression_loss: 0.7980 - classification_loss: 0.0941 297/500 [================>.............] - ETA: 51s - loss: 0.8922 - regression_loss: 0.7981 - classification_loss: 0.0940 298/500 [================>.............] - ETA: 51s - loss: 0.8918 - regression_loss: 0.7978 - classification_loss: 0.0941 299/500 [================>.............] - ETA: 50s - loss: 0.8922 - regression_loss: 0.7982 - classification_loss: 0.0940 300/500 [=================>............] - ETA: 50s - loss: 0.8933 - regression_loss: 0.7993 - classification_loss: 0.0940 301/500 [=================>............] - ETA: 50s - loss: 0.8926 - regression_loss: 0.7987 - classification_loss: 0.0939 302/500 [=================>............] - ETA: 50s - loss: 0.8925 - regression_loss: 0.7989 - classification_loss: 0.0936 303/500 [=================>............] - ETA: 49s - loss: 0.8912 - regression_loss: 0.7976 - classification_loss: 0.0935 304/500 [=================>............] - ETA: 49s - loss: 0.8889 - regression_loss: 0.7957 - classification_loss: 0.0933 305/500 [=================>............] - ETA: 49s - loss: 0.8889 - regression_loss: 0.7957 - classification_loss: 0.0932 306/500 [=================>............] - ETA: 49s - loss: 0.8899 - regression_loss: 0.7965 - classification_loss: 0.0934 307/500 [=================>............] - ETA: 48s - loss: 0.8878 - regression_loss: 0.7946 - classification_loss: 0.0931 308/500 [=================>............] 
- ETA: 48s - loss: 0.8882 - regression_loss: 0.7950 - classification_loss: 0.0932 309/500 [=================>............] - ETA: 48s - loss: 0.8877 - regression_loss: 0.7945 - classification_loss: 0.0932 310/500 [=================>............] - ETA: 48s - loss: 0.8871 - regression_loss: 0.7940 - classification_loss: 0.0931 311/500 [=================>............] - ETA: 47s - loss: 0.8853 - regression_loss: 0.7924 - classification_loss: 0.0929 312/500 [=================>............] - ETA: 47s - loss: 0.8856 - regression_loss: 0.7927 - classification_loss: 0.0929 313/500 [=================>............] - ETA: 47s - loss: 0.8852 - regression_loss: 0.7924 - classification_loss: 0.0928 314/500 [=================>............] - ETA: 47s - loss: 0.8861 - regression_loss: 0.7933 - classification_loss: 0.0928 315/500 [=================>............] - ETA: 46s - loss: 0.8851 - regression_loss: 0.7923 - classification_loss: 0.0927 316/500 [=================>............] - ETA: 46s - loss: 0.8853 - regression_loss: 0.7925 - classification_loss: 0.0928 317/500 [==================>...........] - ETA: 46s - loss: 0.8851 - regression_loss: 0.7923 - classification_loss: 0.0928 318/500 [==================>...........] - ETA: 46s - loss: 0.8844 - regression_loss: 0.7917 - classification_loss: 0.0927 319/500 [==================>...........] - ETA: 45s - loss: 0.8848 - regression_loss: 0.7921 - classification_loss: 0.0926 320/500 [==================>...........] - ETA: 45s - loss: 0.8845 - regression_loss: 0.7919 - classification_loss: 0.0926 321/500 [==================>...........] - ETA: 45s - loss: 0.8837 - regression_loss: 0.7912 - classification_loss: 0.0925 322/500 [==================>...........] - ETA: 45s - loss: 0.8838 - regression_loss: 0.7913 - classification_loss: 0.0925 323/500 [==================>...........] - ETA: 44s - loss: 0.8831 - regression_loss: 0.7908 - classification_loss: 0.0924 324/500 [==================>...........] 
- ETA: 44s - loss: 0.8857 - regression_loss: 0.7931 - classification_loss: 0.0926 325/500 [==================>...........] - ETA: 44s - loss: 0.8866 - regression_loss: 0.7939 - classification_loss: 0.0928 326/500 [==================>...........] - ETA: 44s - loss: 0.8866 - regression_loss: 0.7939 - classification_loss: 0.0927 327/500 [==================>...........] - ETA: 43s - loss: 0.8884 - regression_loss: 0.7954 - classification_loss: 0.0930 328/500 [==================>...........] - ETA: 43s - loss: 0.8872 - regression_loss: 0.7944 - classification_loss: 0.0928 329/500 [==================>...........] - ETA: 43s - loss: 0.8867 - regression_loss: 0.7940 - classification_loss: 0.0927 330/500 [==================>...........] - ETA: 43s - loss: 0.8856 - regression_loss: 0.7930 - classification_loss: 0.0926 331/500 [==================>...........] - ETA: 42s - loss: 0.8839 - regression_loss: 0.7915 - classification_loss: 0.0923 332/500 [==================>...........] - ETA: 42s - loss: 0.8834 - regression_loss: 0.7912 - classification_loss: 0.0922 333/500 [==================>...........] - ETA: 42s - loss: 0.8820 - regression_loss: 0.7900 - classification_loss: 0.0919 334/500 [===================>..........] - ETA: 42s - loss: 0.8821 - regression_loss: 0.7902 - classification_loss: 0.0919 335/500 [===================>..........] - ETA: 41s - loss: 0.8822 - regression_loss: 0.7903 - classification_loss: 0.0919 336/500 [===================>..........] - ETA: 41s - loss: 0.8818 - regression_loss: 0.7899 - classification_loss: 0.0919 337/500 [===================>..........] - ETA: 41s - loss: 0.8808 - regression_loss: 0.7891 - classification_loss: 0.0917 338/500 [===================>..........] - ETA: 41s - loss: 0.8814 - regression_loss: 0.7897 - classification_loss: 0.0917 339/500 [===================>..........] - ETA: 40s - loss: 0.8824 - regression_loss: 0.7905 - classification_loss: 0.0919 340/500 [===================>..........] 
- ETA: 40s - loss: 0.8827 - regression_loss: 0.7908 - classification_loss: 0.0918 341/500 [===================>..........] - ETA: 40s - loss: 0.8828 - regression_loss: 0.7910 - classification_loss: 0.0918 342/500 [===================>..........] - ETA: 40s - loss: 0.8831 - regression_loss: 0.7912 - classification_loss: 0.0919 343/500 [===================>..........] - ETA: 39s - loss: 0.8836 - regression_loss: 0.7917 - classification_loss: 0.0919 344/500 [===================>..........] - ETA: 39s - loss: 0.8834 - regression_loss: 0.7914 - classification_loss: 0.0920 345/500 [===================>..........] - ETA: 39s - loss: 0.8826 - regression_loss: 0.7907 - classification_loss: 0.0919 346/500 [===================>..........] - ETA: 39s - loss: 0.8815 - regression_loss: 0.7897 - classification_loss: 0.0918 347/500 [===================>..........] - ETA: 38s - loss: 0.8794 - regression_loss: 0.7879 - classification_loss: 0.0915 348/500 [===================>..........] - ETA: 38s - loss: 0.8784 - regression_loss: 0.7871 - classification_loss: 0.0913 349/500 [===================>..........] - ETA: 38s - loss: 0.8782 - regression_loss: 0.7869 - classification_loss: 0.0913 350/500 [====================>.........] - ETA: 38s - loss: 0.8769 - regression_loss: 0.7858 - classification_loss: 0.0911 351/500 [====================>.........] - ETA: 37s - loss: 0.8782 - regression_loss: 0.7868 - classification_loss: 0.0914 352/500 [====================>.........] - ETA: 37s - loss: 0.8782 - regression_loss: 0.7869 - classification_loss: 0.0914 353/500 [====================>.........] - ETA: 37s - loss: 0.8779 - regression_loss: 0.7865 - classification_loss: 0.0914 354/500 [====================>.........] - ETA: 37s - loss: 0.8777 - regression_loss: 0.7863 - classification_loss: 0.0915 355/500 [====================>.........] - ETA: 36s - loss: 0.8777 - regression_loss: 0.7863 - classification_loss: 0.0913 356/500 [====================>.........] 
- ETA: 36s - loss: 0.8767 - regression_loss: 0.7856 - classification_loss: 0.0911 357/500 [====================>.........] - ETA: 36s - loss: 0.8756 - regression_loss: 0.7847 - classification_loss: 0.0909 358/500 [====================>.........] - ETA: 36s - loss: 0.8755 - regression_loss: 0.7846 - classification_loss: 0.0908 359/500 [====================>.........] - ETA: 35s - loss: 0.8758 - regression_loss: 0.7850 - classification_loss: 0.0908 360/500 [====================>.........] - ETA: 35s - loss: 0.8776 - regression_loss: 0.7864 - classification_loss: 0.0912 361/500 [====================>.........] - ETA: 35s - loss: 0.8766 - regression_loss: 0.7856 - classification_loss: 0.0910 362/500 [====================>.........] - ETA: 34s - loss: 0.8757 - regression_loss: 0.7848 - classification_loss: 0.0909 363/500 [====================>.........] - ETA: 34s - loss: 0.8763 - regression_loss: 0.7853 - classification_loss: 0.0910 364/500 [====================>.........] - ETA: 34s - loss: 0.8777 - regression_loss: 0.7861 - classification_loss: 0.0917 365/500 [====================>.........] - ETA: 34s - loss: 0.8765 - regression_loss: 0.7851 - classification_loss: 0.0915 366/500 [====================>.........] - ETA: 33s - loss: 0.8771 - regression_loss: 0.7854 - classification_loss: 0.0916 367/500 [=====================>........] - ETA: 33s - loss: 0.8772 - regression_loss: 0.7856 - classification_loss: 0.0916 368/500 [=====================>........] - ETA: 33s - loss: 0.8782 - regression_loss: 0.7865 - classification_loss: 0.0917 369/500 [=====================>........] - ETA: 33s - loss: 0.8778 - regression_loss: 0.7862 - classification_loss: 0.0916 370/500 [=====================>........] - ETA: 32s - loss: 0.8779 - regression_loss: 0.7864 - classification_loss: 0.0916 371/500 [=====================>........] - ETA: 32s - loss: 0.8781 - regression_loss: 0.7866 - classification_loss: 0.0915 372/500 [=====================>........] 
- ETA: 32s - loss: 0.8785 - regression_loss: 0.7869 - classification_loss: 0.0916 373/500 [=====================>........] - ETA: 32s - loss: 0.8791 - regression_loss: 0.7874 - classification_loss: 0.0917 374/500 [=====================>........] - ETA: 31s - loss: 0.8780 - regression_loss: 0.7865 - classification_loss: 0.0915 375/500 [=====================>........] - ETA: 31s - loss: 0.8801 - regression_loss: 0.7883 - classification_loss: 0.0918 376/500 [=====================>........] - ETA: 31s - loss: 0.8799 - regression_loss: 0.7882 - classification_loss: 0.0917 377/500 [=====================>........] - ETA: 31s - loss: 0.8784 - regression_loss: 0.7869 - classification_loss: 0.0915 378/500 [=====================>........] - ETA: 30s - loss: 0.8775 - regression_loss: 0.7861 - classification_loss: 0.0914 379/500 [=====================>........] - ETA: 30s - loss: 0.8778 - regression_loss: 0.7864 - classification_loss: 0.0914 380/500 [=====================>........] - ETA: 30s - loss: 0.8787 - regression_loss: 0.7873 - classification_loss: 0.0914 381/500 [=====================>........] - ETA: 30s - loss: 0.8793 - regression_loss: 0.7878 - classification_loss: 0.0915 382/500 [=====================>........] - ETA: 29s - loss: 0.8789 - regression_loss: 0.7875 - classification_loss: 0.0914 383/500 [=====================>........] - ETA: 29s - loss: 0.8791 - regression_loss: 0.7877 - classification_loss: 0.0914 384/500 [======================>.......] - ETA: 29s - loss: 0.8777 - regression_loss: 0.7865 - classification_loss: 0.0912 385/500 [======================>.......] - ETA: 29s - loss: 0.8777 - regression_loss: 0.7866 - classification_loss: 0.0911 386/500 [======================>.......] - ETA: 28s - loss: 0.8782 - regression_loss: 0.7870 - classification_loss: 0.0912 387/500 [======================>.......] - ETA: 28s - loss: 0.8790 - regression_loss: 0.7876 - classification_loss: 0.0914 388/500 [======================>.......] 
- ETA: 28s - loss: 0.8786 - regression_loss: 0.7873 - classification_loss: 0.0912 389/500 [======================>.......] - ETA: 28s - loss: 0.8781 - regression_loss: 0.7869 - classification_loss: 0.0912 390/500 [======================>.......] - ETA: 27s - loss: 0.8786 - regression_loss: 0.7874 - classification_loss: 0.0912 391/500 [======================>.......] - ETA: 27s - loss: 0.8787 - regression_loss: 0.7876 - classification_loss: 0.0911 392/500 [======================>.......] - ETA: 27s - loss: 0.8779 - regression_loss: 0.7869 - classification_loss: 0.0910 393/500 [======================>.......] - ETA: 27s - loss: 0.8774 - regression_loss: 0.7865 - classification_loss: 0.0908 394/500 [======================>.......] - ETA: 26s - loss: 0.8763 - regression_loss: 0.7856 - classification_loss: 0.0907 395/500 [======================>.......] - ETA: 26s - loss: 0.8750 - regression_loss: 0.7845 - classification_loss: 0.0905 396/500 [======================>.......] - ETA: 26s - loss: 0.8731 - regression_loss: 0.7828 - classification_loss: 0.0903 397/500 [======================>.......] - ETA: 26s - loss: 0.8718 - regression_loss: 0.7816 - classification_loss: 0.0901 398/500 [======================>.......] - ETA: 25s - loss: 0.8716 - regression_loss: 0.7814 - classification_loss: 0.0902 399/500 [======================>.......] - ETA: 25s - loss: 0.8740 - regression_loss: 0.7834 - classification_loss: 0.0906 400/500 [=======================>......] - ETA: 25s - loss: 0.8736 - regression_loss: 0.7832 - classification_loss: 0.0905 401/500 [=======================>......] - ETA: 25s - loss: 0.8738 - regression_loss: 0.7834 - classification_loss: 0.0904 402/500 [=======================>......] - ETA: 24s - loss: 0.8755 - regression_loss: 0.7847 - classification_loss: 0.0908 403/500 [=======================>......] - ETA: 24s - loss: 0.8776 - regression_loss: 0.7864 - classification_loss: 0.0911 404/500 [=======================>......] 
- ETA: 24s - loss: 0.8776 - regression_loss: 0.7865 - classification_loss: 0.0912 405/500 [=======================>......] - ETA: 24s - loss: 0.8771 - regression_loss: 0.7861 - classification_loss: 0.0910 406/500 [=======================>......] - ETA: 23s - loss: 0.8776 - regression_loss: 0.7865 - classification_loss: 0.0911 407/500 [=======================>......] - ETA: 23s - loss: 0.8784 - regression_loss: 0.7874 - classification_loss: 0.0910 408/500 [=======================>......] - ETA: 23s - loss: 0.8783 - regression_loss: 0.7874 - classification_loss: 0.0910 409/500 [=======================>......] - ETA: 23s - loss: 0.8785 - regression_loss: 0.7876 - classification_loss: 0.0909 410/500 [=======================>......] - ETA: 22s - loss: 0.8779 - regression_loss: 0.7871 - classification_loss: 0.0908 411/500 [=======================>......] - ETA: 22s - loss: 0.8787 - regression_loss: 0.7878 - classification_loss: 0.0909 412/500 [=======================>......] - ETA: 22s - loss: 0.8786 - regression_loss: 0.7878 - classification_loss: 0.0908 413/500 [=======================>......] - ETA: 22s - loss: 0.8805 - regression_loss: 0.7893 - classification_loss: 0.0912 414/500 [=======================>......] - ETA: 21s - loss: 0.8807 - regression_loss: 0.7894 - classification_loss: 0.0913 415/500 [=======================>......] - ETA: 21s - loss: 0.8810 - regression_loss: 0.7898 - classification_loss: 0.0912 416/500 [=======================>......] - ETA: 21s - loss: 0.8821 - regression_loss: 0.7908 - classification_loss: 0.0913 417/500 [========================>.....] - ETA: 21s - loss: 0.8829 - regression_loss: 0.7917 - classification_loss: 0.0913 418/500 [========================>.....] - ETA: 20s - loss: 0.8833 - regression_loss: 0.7920 - classification_loss: 0.0913 419/500 [========================>.....] - ETA: 20s - loss: 0.8832 - regression_loss: 0.7919 - classification_loss: 0.0912 420/500 [========================>.....] 
- ETA: 20s - loss: 0.8847 - regression_loss: 0.7931 - classification_loss: 0.0915 421/500 [========================>.....] - ETA: 20s - loss: 0.8854 - regression_loss: 0.7939 - classification_loss: 0.0914 422/500 [========================>.....] - ETA: 19s - loss: 0.8850 - regression_loss: 0.7936 - classification_loss: 0.0914 423/500 [========================>.....] - ETA: 19s - loss: 0.8844 - regression_loss: 0.7931 - classification_loss: 0.0913 424/500 [========================>.....] - ETA: 19s - loss: 0.8832 - regression_loss: 0.7922 - classification_loss: 0.0911 425/500 [========================>.....] - ETA: 18s - loss: 0.8822 - regression_loss: 0.7913 - classification_loss: 0.0909 426/500 [========================>.....] - ETA: 18s - loss: 0.8823 - regression_loss: 0.7914 - classification_loss: 0.0909 427/500 [========================>.....] - ETA: 18s - loss: 0.8824 - regression_loss: 0.7914 - classification_loss: 0.0909 428/500 [========================>.....] - ETA: 18s - loss: 0.8804 - regression_loss: 0.7896 - classification_loss: 0.0909 429/500 [========================>.....] - ETA: 17s - loss: 0.8812 - regression_loss: 0.7903 - classification_loss: 0.0909 430/500 [========================>.....] - ETA: 17s - loss: 0.8816 - regression_loss: 0.7907 - classification_loss: 0.0909 431/500 [========================>.....] - ETA: 17s - loss: 0.8812 - regression_loss: 0.7904 - classification_loss: 0.0908 432/500 [========================>.....] - ETA: 17s - loss: 0.8805 - regression_loss: 0.7898 - classification_loss: 0.0906 433/500 [========================>.....] - ETA: 16s - loss: 0.8809 - regression_loss: 0.7902 - classification_loss: 0.0907 434/500 [=========================>....] - ETA: 16s - loss: 0.8806 - regression_loss: 0.7901 - classification_loss: 0.0906 435/500 [=========================>....] - ETA: 16s - loss: 0.8811 - regression_loss: 0.7904 - classification_loss: 0.0906 436/500 [=========================>....] 
- ETA: 16s - loss: 0.8808 - regression_loss: 0.7903 - classification_loss: 0.0905 437/500 [=========================>....] - ETA: 15s - loss: 0.8814 - regression_loss: 0.7910 - classification_loss: 0.0905 438/500 [=========================>....] - ETA: 15s - loss: 0.8809 - regression_loss: 0.7906 - classification_loss: 0.0903 439/500 [=========================>....] - ETA: 15s - loss: 0.8800 - regression_loss: 0.7899 - classification_loss: 0.0902 440/500 [=========================>....] - ETA: 15s - loss: 0.8796 - regression_loss: 0.7896 - classification_loss: 0.0901 441/500 [=========================>....] - ETA: 14s - loss: 0.8790 - regression_loss: 0.7891 - classification_loss: 0.0900 442/500 [=========================>....] - ETA: 14s - loss: 0.8802 - regression_loss: 0.7900 - classification_loss: 0.0902 443/500 [=========================>....] - ETA: 14s - loss: 0.8801 - regression_loss: 0.7900 - classification_loss: 0.0901 444/500 [=========================>....] - ETA: 14s - loss: 0.8804 - regression_loss: 0.7902 - classification_loss: 0.0902 445/500 [=========================>....] - ETA: 13s - loss: 0.8801 - regression_loss: 0.7901 - classification_loss: 0.0901 446/500 [=========================>....] - ETA: 13s - loss: 0.8801 - regression_loss: 0.7899 - classification_loss: 0.0902 447/500 [=========================>....] - ETA: 13s - loss: 0.8809 - regression_loss: 0.7906 - classification_loss: 0.0902 448/500 [=========================>....] - ETA: 13s - loss: 0.8807 - regression_loss: 0.7906 - classification_loss: 0.0901 449/500 [=========================>....] - ETA: 12s - loss: 0.8812 - regression_loss: 0.7910 - classification_loss: 0.0902 450/500 [==========================>...] - ETA: 12s - loss: 0.8809 - regression_loss: 0.7907 - classification_loss: 0.0902 451/500 [==========================>...] - ETA: 12s - loss: 0.8809 - regression_loss: 0.7908 - classification_loss: 0.0901 452/500 [==========================>...] 
- ETA: 12s - loss: 0.8809 - regression_loss: 0.7907 - classification_loss: 0.0902 453/500 [==========================>...] - ETA: 11s - loss: 0.8796 - regression_loss: 0.7896 - classification_loss: 0.0900 454/500 [==========================>...] - ETA: 11s - loss: 0.8794 - regression_loss: 0.7894 - classification_loss: 0.0900 455/500 [==========================>...] - ETA: 11s - loss: 0.8796 - regression_loss: 0.7895 - classification_loss: 0.0901 456/500 [==========================>...] - ETA: 11s - loss: 0.8782 - regression_loss: 0.7883 - classification_loss: 0.0899 457/500 [==========================>...] - ETA: 10s - loss: 0.8790 - regression_loss: 0.7890 - classification_loss: 0.0900 458/500 [==========================>...] - ETA: 10s - loss: 0.8780 - regression_loss: 0.7881 - classification_loss: 0.0899 459/500 [==========================>...] - ETA: 10s - loss: 0.8795 - regression_loss: 0.7891 - classification_loss: 0.0904 460/500 [==========================>...] - ETA: 10s - loss: 0.8791 - regression_loss: 0.7888 - classification_loss: 0.0903 461/500 [==========================>...] - ETA: 9s - loss: 0.8789 - regression_loss: 0.7887 - classification_loss: 0.0902  462/500 [==========================>...] - ETA: 9s - loss: 0.8791 - regression_loss: 0.7889 - classification_loss: 0.0902 463/500 [==========================>...] - ETA: 9s - loss: 0.8788 - regression_loss: 0.7886 - classification_loss: 0.0902 464/500 [==========================>...] - ETA: 9s - loss: 0.8790 - regression_loss: 0.7888 - classification_loss: 0.0902 465/500 [==========================>...] - ETA: 8s - loss: 0.8787 - regression_loss: 0.7885 - classification_loss: 0.0902 466/500 [==========================>...] - ETA: 8s - loss: 0.8781 - regression_loss: 0.7879 - classification_loss: 0.0902 467/500 [===========================>..] - ETA: 8s - loss: 0.8774 - regression_loss: 0.7873 - classification_loss: 0.0901 468/500 [===========================>..] 
- ETA: 8s - loss: 0.8763 - regression_loss: 0.7864 - classification_loss: 0.0899 469/500 [===========================>..] - ETA: 7s - loss: 0.8763 - regression_loss: 0.7863 - classification_loss: 0.0899 470/500 [===========================>..] - ETA: 7s - loss: 0.8767 - regression_loss: 0.7867 - classification_loss: 0.0900 471/500 [===========================>..] - ETA: 7s - loss: 0.8775 - regression_loss: 0.7874 - classification_loss: 0.0901 472/500 [===========================>..] - ETA: 7s - loss: 0.8785 - regression_loss: 0.7883 - classification_loss: 0.0902 473/500 [===========================>..] - ETA: 6s - loss: 0.8790 - regression_loss: 0.7888 - classification_loss: 0.0902 474/500 [===========================>..] - ETA: 6s - loss: 0.8787 - regression_loss: 0.7885 - classification_loss: 0.0902 475/500 [===========================>..] - ETA: 6s - loss: 0.8778 - regression_loss: 0.7878 - classification_loss: 0.0901 476/500 [===========================>..] - ETA: 6s - loss: 0.8780 - regression_loss: 0.7880 - classification_loss: 0.0900 477/500 [===========================>..] - ETA: 5s - loss: 0.8796 - regression_loss: 0.7893 - classification_loss: 0.0903 478/500 [===========================>..] - ETA: 5s - loss: 0.8804 - regression_loss: 0.7900 - classification_loss: 0.0905 479/500 [===========================>..] - ETA: 5s - loss: 0.8794 - regression_loss: 0.7891 - classification_loss: 0.0903 480/500 [===========================>..] - ETA: 5s - loss: 0.8800 - regression_loss: 0.7897 - classification_loss: 0.0903 481/500 [===========================>..] - ETA: 4s - loss: 0.8801 - regression_loss: 0.7898 - classification_loss: 0.0904 482/500 [===========================>..] - ETA: 4s - loss: 0.8784 - regression_loss: 0.7881 - classification_loss: 0.0902 483/500 [===========================>..] - ETA: 4s - loss: 0.8789 - regression_loss: 0.7886 - classification_loss: 0.0904 484/500 [============================>.] 
- ETA: 4s - loss: 0.8783 - regression_loss: 0.7880 - classification_loss: 0.0903 485/500 [============================>.] - ETA: 3s - loss: 0.8791 - regression_loss: 0.7887 - classification_loss: 0.0904 486/500 [============================>.] - ETA: 3s - loss: 0.8792 - regression_loss: 0.7888 - classification_loss: 0.0904 487/500 [============================>.] - ETA: 3s - loss: 0.8788 - regression_loss: 0.7886 - classification_loss: 0.0903 488/500 [============================>.] - ETA: 3s - loss: 0.8791 - regression_loss: 0.7888 - classification_loss: 0.0903 489/500 [============================>.] - ETA: 2s - loss: 0.8796 - regression_loss: 0.7892 - classification_loss: 0.0904 490/500 [============================>.] - ETA: 2s - loss: 0.8802 - regression_loss: 0.7897 - classification_loss: 0.0905 491/500 [============================>.] - ETA: 2s - loss: 0.8796 - regression_loss: 0.7891 - classification_loss: 0.0905 492/500 [============================>.] - ETA: 2s - loss: 0.8793 - regression_loss: 0.7889 - classification_loss: 0.0904 493/500 [============================>.] - ETA: 1s - loss: 0.8794 - regression_loss: 0.7889 - classification_loss: 0.0905 494/500 [============================>.] - ETA: 1s - loss: 0.8801 - regression_loss: 0.7895 - classification_loss: 0.0906 495/500 [============================>.] - ETA: 1s - loss: 0.8818 - regression_loss: 0.7909 - classification_loss: 0.0909 496/500 [============================>.] - ETA: 1s - loss: 0.8825 - regression_loss: 0.7915 - classification_loss: 0.0910 497/500 [============================>.] - ETA: 0s - loss: 0.8828 - regression_loss: 0.7917 - classification_loss: 0.0910 498/500 [============================>.] - ETA: 0s - loss: 0.8824 - regression_loss: 0.7915 - classification_loss: 0.0910 499/500 [============================>.] 
- ETA: 0s - loss: 0.8818 - regression_loss: 0.7909 - classification_loss: 0.0909
500/500 [==============================] - 127s 254ms/step - loss: 0.8809 - regression_loss: 0.7902 - classification_loss: 0.0908
1172 instances of class plum with average precision: 0.8028
mAP: 0.8028
Epoch 00052: saving model to ./training/snapshots/resnet50_pascal_52.h5
Epoch 53/150
  1/500 [..............................] - ETA: 1:58 - loss: 0.9738 - regression_loss: 0.8452 - classification_loss: 0.1286
[progress-bar redraws for steps 2-45 elided; loss fluctuated between ~0.72 and ~0.97 while the running average settled near 0.88]
 46/500 [=>............................]
- ETA: 1:51 - loss: 0.8853 - regression_loss: 0.7947 - classification_loss: 0.0906 47/500 [=>............................] - ETA: 1:50 - loss: 0.8894 - regression_loss: 0.7979 - classification_loss: 0.0916 48/500 [=>............................] - ETA: 1:50 - loss: 0.8896 - regression_loss: 0.7985 - classification_loss: 0.0911 49/500 [=>............................] - ETA: 1:50 - loss: 0.8812 - regression_loss: 0.7913 - classification_loss: 0.0898 50/500 [==>...........................] - ETA: 1:50 - loss: 0.8742 - regression_loss: 0.7857 - classification_loss: 0.0885 51/500 [==>...........................] - ETA: 1:49 - loss: 0.8703 - regression_loss: 0.7827 - classification_loss: 0.0876 52/500 [==>...........................] - ETA: 1:49 - loss: 0.8733 - regression_loss: 0.7851 - classification_loss: 0.0882 53/500 [==>...........................] - ETA: 1:49 - loss: 0.8736 - regression_loss: 0.7856 - classification_loss: 0.0880 54/500 [==>...........................] - ETA: 1:48 - loss: 0.8747 - regression_loss: 0.7864 - classification_loss: 0.0882 55/500 [==>...........................] - ETA: 1:48 - loss: 0.8676 - regression_loss: 0.7804 - classification_loss: 0.0872 56/500 [==>...........................] - ETA: 1:48 - loss: 0.8740 - regression_loss: 0.7852 - classification_loss: 0.0887 57/500 [==>...........................] - ETA: 1:48 - loss: 0.8795 - regression_loss: 0.7900 - classification_loss: 0.0894 58/500 [==>...........................] - ETA: 1:48 - loss: 0.8804 - regression_loss: 0.7905 - classification_loss: 0.0898 59/500 [==>...........................] - ETA: 1:48 - loss: 0.8832 - regression_loss: 0.7928 - classification_loss: 0.0904 60/500 [==>...........................] - ETA: 1:47 - loss: 0.8860 - regression_loss: 0.7965 - classification_loss: 0.0895 61/500 [==>...........................] - ETA: 1:47 - loss: 0.8778 - regression_loss: 0.7892 - classification_loss: 0.0886 62/500 [==>...........................] 
- ETA: 1:47 - loss: 0.8700 - regression_loss: 0.7825 - classification_loss: 0.0875 63/500 [==>...........................] - ETA: 1:47 - loss: 0.8724 - regression_loss: 0.7843 - classification_loss: 0.0881 64/500 [==>...........................] - ETA: 1:47 - loss: 0.8730 - regression_loss: 0.7851 - classification_loss: 0.0879 65/500 [==>...........................] - ETA: 1:47 - loss: 0.8720 - regression_loss: 0.7828 - classification_loss: 0.0891 66/500 [==>...........................] - ETA: 1:47 - loss: 0.8754 - regression_loss: 0.7864 - classification_loss: 0.0890 67/500 [===>..........................] - ETA: 1:46 - loss: 0.8698 - regression_loss: 0.7818 - classification_loss: 0.0880 68/500 [===>..........................] - ETA: 1:46 - loss: 0.8766 - regression_loss: 0.7877 - classification_loss: 0.0889 69/500 [===>..........................] - ETA: 1:46 - loss: 0.8780 - regression_loss: 0.7891 - classification_loss: 0.0889 70/500 [===>..........................] - ETA: 1:46 - loss: 0.8785 - regression_loss: 0.7897 - classification_loss: 0.0888 71/500 [===>..........................] - ETA: 1:46 - loss: 0.8812 - regression_loss: 0.7921 - classification_loss: 0.0892 72/500 [===>..........................] - ETA: 1:47 - loss: 0.8851 - regression_loss: 0.7953 - classification_loss: 0.0898 73/500 [===>..........................] - ETA: 1:46 - loss: 0.8776 - regression_loss: 0.7888 - classification_loss: 0.0888 74/500 [===>..........................] - ETA: 1:46 - loss: 0.8762 - regression_loss: 0.7875 - classification_loss: 0.0888 75/500 [===>..........................] - ETA: 1:46 - loss: 0.8749 - regression_loss: 0.7861 - classification_loss: 0.0888 76/500 [===>..........................] - ETA: 1:45 - loss: 0.8747 - regression_loss: 0.7854 - classification_loss: 0.0893 77/500 [===>..........................] - ETA: 1:45 - loss: 0.8767 - regression_loss: 0.7868 - classification_loss: 0.0899 78/500 [===>..........................] 
- ETA: 1:45 - loss: 0.8767 - regression_loss: 0.7868 - classification_loss: 0.0899 79/500 [===>..........................] - ETA: 1:44 - loss: 0.8701 - regression_loss: 0.7811 - classification_loss: 0.0890 80/500 [===>..........................] - ETA: 1:44 - loss: 0.8682 - regression_loss: 0.7799 - classification_loss: 0.0883 81/500 [===>..........................] - ETA: 1:44 - loss: 0.8635 - regression_loss: 0.7755 - classification_loss: 0.0880 82/500 [===>..........................] - ETA: 1:43 - loss: 0.8645 - regression_loss: 0.7763 - classification_loss: 0.0882 83/500 [===>..........................] - ETA: 1:43 - loss: 0.8609 - regression_loss: 0.7733 - classification_loss: 0.0877 84/500 [====>.........................] - ETA: 1:43 - loss: 0.8671 - regression_loss: 0.7783 - classification_loss: 0.0888 85/500 [====>.........................] - ETA: 1:42 - loss: 0.8632 - regression_loss: 0.7744 - classification_loss: 0.0889 86/500 [====>.........................] - ETA: 1:42 - loss: 0.8605 - regression_loss: 0.7718 - classification_loss: 0.0887 87/500 [====>.........................] - ETA: 1:42 - loss: 0.8672 - regression_loss: 0.7771 - classification_loss: 0.0901 88/500 [====>.........................] - ETA: 1:41 - loss: 0.8671 - regression_loss: 0.7771 - classification_loss: 0.0900 89/500 [====>.........................] - ETA: 1:41 - loss: 0.8660 - regression_loss: 0.7761 - classification_loss: 0.0899 90/500 [====>.........................] - ETA: 1:41 - loss: 0.8607 - regression_loss: 0.7716 - classification_loss: 0.0891 91/500 [====>.........................] - ETA: 1:40 - loss: 0.8583 - regression_loss: 0.7699 - classification_loss: 0.0884 92/500 [====>.........................] - ETA: 1:40 - loss: 0.8560 - regression_loss: 0.7683 - classification_loss: 0.0877 93/500 [====>.........................] - ETA: 1:40 - loss: 0.8507 - regression_loss: 0.7638 - classification_loss: 0.0870 94/500 [====>.........................] 
- ETA: 1:39 - loss: 0.8552 - regression_loss: 0.7683 - classification_loss: 0.0869 95/500 [====>.........................] - ETA: 1:39 - loss: 0.8556 - regression_loss: 0.7684 - classification_loss: 0.0871 96/500 [====>.........................] - ETA: 1:39 - loss: 0.8554 - regression_loss: 0.7686 - classification_loss: 0.0868 97/500 [====>.........................] - ETA: 1:39 - loss: 0.8616 - regression_loss: 0.7732 - classification_loss: 0.0884 98/500 [====>.........................] - ETA: 1:38 - loss: 0.8680 - regression_loss: 0.7787 - classification_loss: 0.0893 99/500 [====>.........................] - ETA: 1:38 - loss: 0.8698 - regression_loss: 0.7804 - classification_loss: 0.0894 100/500 [=====>........................] - ETA: 1:38 - loss: 0.8707 - regression_loss: 0.7813 - classification_loss: 0.0894 101/500 [=====>........................] - ETA: 1:37 - loss: 0.8687 - regression_loss: 0.7797 - classification_loss: 0.0890 102/500 [=====>........................] - ETA: 1:37 - loss: 0.8666 - regression_loss: 0.7781 - classification_loss: 0.0885 103/500 [=====>........................] - ETA: 1:37 - loss: 0.8697 - regression_loss: 0.7807 - classification_loss: 0.0890 104/500 [=====>........................] - ETA: 1:36 - loss: 0.8705 - regression_loss: 0.7812 - classification_loss: 0.0893 105/500 [=====>........................] - ETA: 1:36 - loss: 0.8710 - regression_loss: 0.7818 - classification_loss: 0.0892 106/500 [=====>........................] - ETA: 1:36 - loss: 0.8716 - regression_loss: 0.7825 - classification_loss: 0.0891 107/500 [=====>........................] - ETA: 1:36 - loss: 0.8717 - regression_loss: 0.7828 - classification_loss: 0.0890 108/500 [=====>........................] - ETA: 1:35 - loss: 0.8725 - regression_loss: 0.7835 - classification_loss: 0.0890 109/500 [=====>........................] - ETA: 1:35 - loss: 0.8761 - regression_loss: 0.7868 - classification_loss: 0.0893 110/500 [=====>........................] 
- ETA: 1:35 - loss: 0.8782 - regression_loss: 0.7888 - classification_loss: 0.0893 111/500 [=====>........................] - ETA: 1:35 - loss: 0.8784 - regression_loss: 0.7890 - classification_loss: 0.0894 112/500 [=====>........................] - ETA: 1:35 - loss: 0.8805 - regression_loss: 0.7911 - classification_loss: 0.0894 113/500 [=====>........................] - ETA: 1:34 - loss: 0.8750 - regression_loss: 0.7863 - classification_loss: 0.0887 114/500 [=====>........................] - ETA: 1:34 - loss: 0.8771 - regression_loss: 0.7879 - classification_loss: 0.0892 115/500 [=====>........................] - ETA: 1:34 - loss: 0.8791 - regression_loss: 0.7896 - classification_loss: 0.0895 116/500 [=====>........................] - ETA: 1:34 - loss: 0.8772 - regression_loss: 0.7879 - classification_loss: 0.0892 117/500 [======>.......................] - ETA: 1:33 - loss: 0.8725 - regression_loss: 0.7839 - classification_loss: 0.0886 118/500 [======>.......................] - ETA: 1:33 - loss: 0.8707 - regression_loss: 0.7827 - classification_loss: 0.0880 119/500 [======>.......................] - ETA: 1:33 - loss: 0.8686 - regression_loss: 0.7807 - classification_loss: 0.0879 120/500 [======>.......................] - ETA: 1:33 - loss: 0.8652 - regression_loss: 0.7779 - classification_loss: 0.0873 121/500 [======>.......................] - ETA: 1:33 - loss: 0.8617 - regression_loss: 0.7750 - classification_loss: 0.0867 122/500 [======>.......................] - ETA: 1:32 - loss: 0.8640 - regression_loss: 0.7765 - classification_loss: 0.0875 123/500 [======>.......................] - ETA: 1:32 - loss: 0.8640 - regression_loss: 0.7764 - classification_loss: 0.0876 124/500 [======>.......................] - ETA: 1:32 - loss: 0.8664 - regression_loss: 0.7782 - classification_loss: 0.0882 125/500 [======>.......................] - ETA: 1:32 - loss: 0.8650 - regression_loss: 0.7770 - classification_loss: 0.0880 126/500 [======>.......................] 
- ETA: 1:31 - loss: 0.8662 - regression_loss: 0.7780 - classification_loss: 0.0882 127/500 [======>.......................] - ETA: 1:31 - loss: 0.8662 - regression_loss: 0.7781 - classification_loss: 0.0881 128/500 [======>.......................] - ETA: 1:31 - loss: 0.8672 - regression_loss: 0.7790 - classification_loss: 0.0882 129/500 [======>.......................] - ETA: 1:31 - loss: 0.8647 - regression_loss: 0.7769 - classification_loss: 0.0877 130/500 [======>.......................] - ETA: 1:30 - loss: 0.8664 - regression_loss: 0.7784 - classification_loss: 0.0880 131/500 [======>.......................] - ETA: 1:30 - loss: 0.8688 - regression_loss: 0.7805 - classification_loss: 0.0883 132/500 [======>.......................] - ETA: 1:30 - loss: 0.8690 - regression_loss: 0.7807 - classification_loss: 0.0883 133/500 [======>.......................] - ETA: 1:30 - loss: 0.8647 - regression_loss: 0.7769 - classification_loss: 0.0878 134/500 [=======>......................] - ETA: 1:30 - loss: 0.8609 - regression_loss: 0.7735 - classification_loss: 0.0874 135/500 [=======>......................] - ETA: 1:29 - loss: 0.8613 - regression_loss: 0.7740 - classification_loss: 0.0873 136/500 [=======>......................] - ETA: 1:29 - loss: 0.8631 - regression_loss: 0.7757 - classification_loss: 0.0873 137/500 [=======>......................] - ETA: 1:29 - loss: 0.8640 - regression_loss: 0.7763 - classification_loss: 0.0877 138/500 [=======>......................] - ETA: 1:29 - loss: 0.8598 - regression_loss: 0.7726 - classification_loss: 0.0871 139/500 [=======>......................] - ETA: 1:28 - loss: 0.8646 - regression_loss: 0.7765 - classification_loss: 0.0881 140/500 [=======>......................] - ETA: 1:28 - loss: 0.8681 - regression_loss: 0.7792 - classification_loss: 0.0888 141/500 [=======>......................] - ETA: 1:28 - loss: 0.8699 - regression_loss: 0.7812 - classification_loss: 0.0887 142/500 [=======>......................] 
- ETA: 1:28 - loss: 0.8718 - regression_loss: 0.7828 - classification_loss: 0.0890 143/500 [=======>......................] - ETA: 1:27 - loss: 0.8725 - regression_loss: 0.7835 - classification_loss: 0.0890 144/500 [=======>......................] - ETA: 1:27 - loss: 0.8730 - regression_loss: 0.7841 - classification_loss: 0.0889 145/500 [=======>......................] - ETA: 1:27 - loss: 0.8703 - regression_loss: 0.7819 - classification_loss: 0.0884 146/500 [=======>......................] - ETA: 1:27 - loss: 0.8742 - regression_loss: 0.7849 - classification_loss: 0.0893 147/500 [=======>......................] - ETA: 1:26 - loss: 0.8724 - regression_loss: 0.7836 - classification_loss: 0.0888 148/500 [=======>......................] - ETA: 1:26 - loss: 0.8746 - regression_loss: 0.7861 - classification_loss: 0.0886 149/500 [=======>......................] - ETA: 1:26 - loss: 0.8729 - regression_loss: 0.7845 - classification_loss: 0.0884 150/500 [========>.....................] - ETA: 1:25 - loss: 0.8702 - regression_loss: 0.7822 - classification_loss: 0.0880 151/500 [========>.....................] - ETA: 1:25 - loss: 0.8708 - regression_loss: 0.7824 - classification_loss: 0.0883 152/500 [========>.....................] - ETA: 1:25 - loss: 0.8673 - regression_loss: 0.7794 - classification_loss: 0.0879 153/500 [========>.....................] - ETA: 1:25 - loss: 0.8698 - regression_loss: 0.7813 - classification_loss: 0.0884 154/500 [========>.....................] - ETA: 1:25 - loss: 0.8721 - regression_loss: 0.7834 - classification_loss: 0.0888 155/500 [========>.....................] - ETA: 1:24 - loss: 0.8716 - regression_loss: 0.7832 - classification_loss: 0.0884 156/500 [========>.....................] - ETA: 1:24 - loss: 0.8712 - regression_loss: 0.7828 - classification_loss: 0.0884 157/500 [========>.....................] - ETA: 1:24 - loss: 0.8697 - regression_loss: 0.7815 - classification_loss: 0.0882 158/500 [========>.....................] 
- ETA: 1:24 - loss: 0.8697 - regression_loss: 0.7817 - classification_loss: 0.0879 159/500 [========>.....................] - ETA: 1:23 - loss: 0.8693 - regression_loss: 0.7815 - classification_loss: 0.0878 160/500 [========>.....................] - ETA: 1:23 - loss: 0.8689 - regression_loss: 0.7809 - classification_loss: 0.0879 161/500 [========>.....................] - ETA: 1:23 - loss: 0.8723 - regression_loss: 0.7834 - classification_loss: 0.0889 162/500 [========>.....................] - ETA: 1:23 - loss: 0.8714 - regression_loss: 0.7824 - classification_loss: 0.0890 163/500 [========>.....................] - ETA: 1:22 - loss: 0.8716 - regression_loss: 0.7825 - classification_loss: 0.0891 164/500 [========>.....................] - ETA: 1:22 - loss: 0.8701 - regression_loss: 0.7815 - classification_loss: 0.0887 165/500 [========>.....................] - ETA: 1:22 - loss: 0.8707 - regression_loss: 0.7820 - classification_loss: 0.0887 166/500 [========>.....................] - ETA: 1:22 - loss: 0.8715 - regression_loss: 0.7827 - classification_loss: 0.0888 167/500 [=========>....................] - ETA: 1:21 - loss: 0.8735 - regression_loss: 0.7845 - classification_loss: 0.0890 168/500 [=========>....................] - ETA: 1:21 - loss: 0.8715 - regression_loss: 0.7828 - classification_loss: 0.0887 169/500 [=========>....................] - ETA: 1:21 - loss: 0.8720 - regression_loss: 0.7833 - classification_loss: 0.0887 170/500 [=========>....................] - ETA: 1:21 - loss: 0.8703 - regression_loss: 0.7818 - classification_loss: 0.0885 171/500 [=========>....................] - ETA: 1:20 - loss: 0.8689 - regression_loss: 0.7808 - classification_loss: 0.0881 172/500 [=========>....................] - ETA: 1:20 - loss: 0.8699 - regression_loss: 0.7816 - classification_loss: 0.0883 173/500 [=========>....................] - ETA: 1:20 - loss: 0.8698 - regression_loss: 0.7816 - classification_loss: 0.0881 174/500 [=========>....................] 
- ETA: 1:20 - loss: 0.8698 - regression_loss: 0.7816 - classification_loss: 0.0882 175/500 [=========>....................] - ETA: 1:19 - loss: 0.8687 - regression_loss: 0.7808 - classification_loss: 0.0879 176/500 [=========>....................] - ETA: 1:19 - loss: 0.8649 - regression_loss: 0.7775 - classification_loss: 0.0875 177/500 [=========>....................] - ETA: 1:19 - loss: 0.8674 - regression_loss: 0.7799 - classification_loss: 0.0875 178/500 [=========>....................] - ETA: 1:19 - loss: 0.8663 - regression_loss: 0.7790 - classification_loss: 0.0873 179/500 [=========>....................] - ETA: 1:18 - loss: 0.8679 - regression_loss: 0.7806 - classification_loss: 0.0873 180/500 [=========>....................] - ETA: 1:18 - loss: 0.8680 - regression_loss: 0.7807 - classification_loss: 0.0873 181/500 [=========>....................] - ETA: 1:18 - loss: 0.8674 - regression_loss: 0.7802 - classification_loss: 0.0872 182/500 [=========>....................] - ETA: 1:18 - loss: 0.8675 - regression_loss: 0.7805 - classification_loss: 0.0870 183/500 [=========>....................] - ETA: 1:17 - loss: 0.8654 - regression_loss: 0.7786 - classification_loss: 0.0868 184/500 [==========>...................] - ETA: 1:17 - loss: 0.8660 - regression_loss: 0.7791 - classification_loss: 0.0869 185/500 [==========>...................] - ETA: 1:17 - loss: 0.8659 - regression_loss: 0.7777 - classification_loss: 0.0881 186/500 [==========>...................] - ETA: 1:17 - loss: 0.8653 - regression_loss: 0.7774 - classification_loss: 0.0879 187/500 [==========>...................] - ETA: 1:16 - loss: 0.8671 - regression_loss: 0.7789 - classification_loss: 0.0881 188/500 [==========>...................] - ETA: 1:16 - loss: 0.8693 - regression_loss: 0.7808 - classification_loss: 0.0884 189/500 [==========>...................] - ETA: 1:16 - loss: 0.8713 - regression_loss: 0.7826 - classification_loss: 0.0887 190/500 [==========>...................] 
- ETA: 1:16 - loss: 0.8715 - regression_loss: 0.7830 - classification_loss: 0.0886 191/500 [==========>...................] - ETA: 1:15 - loss: 0.8709 - regression_loss: 0.7825 - classification_loss: 0.0885 192/500 [==========>...................] - ETA: 1:15 - loss: 0.8690 - regression_loss: 0.7809 - classification_loss: 0.0881 193/500 [==========>...................] - ETA: 1:15 - loss: 0.8690 - regression_loss: 0.7806 - classification_loss: 0.0884 194/500 [==========>...................] - ETA: 1:15 - loss: 0.8685 - regression_loss: 0.7803 - classification_loss: 0.0882 195/500 [==========>...................] - ETA: 1:15 - loss: 0.8655 - regression_loss: 0.7774 - classification_loss: 0.0881 196/500 [==========>...................] - ETA: 1:14 - loss: 0.8663 - regression_loss: 0.7784 - classification_loss: 0.0880 197/500 [==========>...................] - ETA: 1:14 - loss: 0.8648 - regression_loss: 0.7771 - classification_loss: 0.0877 198/500 [==========>...................] - ETA: 1:14 - loss: 0.8644 - regression_loss: 0.7767 - classification_loss: 0.0877 199/500 [==========>...................] - ETA: 1:14 - loss: 0.8650 - regression_loss: 0.7771 - classification_loss: 0.0879 200/500 [===========>..................] - ETA: 1:13 - loss: 0.8671 - regression_loss: 0.7790 - classification_loss: 0.0881 201/500 [===========>..................] - ETA: 1:13 - loss: 0.8664 - regression_loss: 0.7784 - classification_loss: 0.0880 202/500 [===========>..................] - ETA: 1:13 - loss: 0.8665 - regression_loss: 0.7787 - classification_loss: 0.0878 203/500 [===========>..................] - ETA: 1:13 - loss: 0.8663 - regression_loss: 0.7785 - classification_loss: 0.0878 204/500 [===========>..................] - ETA: 1:12 - loss: 0.8689 - regression_loss: 0.7808 - classification_loss: 0.0881 205/500 [===========>..................] - ETA: 1:12 - loss: 0.8687 - regression_loss: 0.7806 - classification_loss: 0.0880 206/500 [===========>..................] 
- ETA: 1:12 - loss: 0.8662 - regression_loss: 0.7786 - classification_loss: 0.0877 207/500 [===========>..................] - ETA: 1:12 - loss: 0.8663 - regression_loss: 0.7787 - classification_loss: 0.0876 208/500 [===========>..................] - ETA: 1:11 - loss: 0.8658 - regression_loss: 0.7782 - classification_loss: 0.0876 209/500 [===========>..................] - ETA: 1:11 - loss: 0.8638 - regression_loss: 0.7765 - classification_loss: 0.0873 210/500 [===========>..................] - ETA: 1:11 - loss: 0.8626 - regression_loss: 0.7755 - classification_loss: 0.0870 211/500 [===========>..................] - ETA: 1:11 - loss: 0.8612 - regression_loss: 0.7743 - classification_loss: 0.0869 212/500 [===========>..................] - ETA: 1:10 - loss: 0.8633 - regression_loss: 0.7761 - classification_loss: 0.0872 213/500 [===========>..................] - ETA: 1:10 - loss: 0.8633 - regression_loss: 0.7759 - classification_loss: 0.0873 214/500 [===========>..................] - ETA: 1:10 - loss: 0.8612 - regression_loss: 0.7743 - classification_loss: 0.0870 215/500 [===========>..................] - ETA: 1:10 - loss: 0.8585 - regression_loss: 0.7719 - classification_loss: 0.0867 216/500 [===========>..................] - ETA: 1:10 - loss: 0.8590 - regression_loss: 0.7724 - classification_loss: 0.0866 217/500 [============>.................] - ETA: 1:09 - loss: 0.8592 - regression_loss: 0.7726 - classification_loss: 0.0866 218/500 [============>.................] - ETA: 1:09 - loss: 0.8587 - regression_loss: 0.7721 - classification_loss: 0.0866 219/500 [============>.................] - ETA: 1:09 - loss: 0.8602 - regression_loss: 0.7733 - classification_loss: 0.0868 220/500 [============>.................] - ETA: 1:09 - loss: 0.8602 - regression_loss: 0.7733 - classification_loss: 0.0869 221/500 [============>.................] - ETA: 1:08 - loss: 0.8578 - regression_loss: 0.7712 - classification_loss: 0.0866 222/500 [============>.................] 
- ETA: 1:08 - loss: 0.8565 - regression_loss: 0.7701 - classification_loss: 0.0865 223/500 [============>.................] - ETA: 1:08 - loss: 0.8587 - regression_loss: 0.7720 - classification_loss: 0.0867 224/500 [============>.................] - ETA: 1:08 - loss: 0.8573 - regression_loss: 0.7709 - classification_loss: 0.0864 225/500 [============>.................] - ETA: 1:07 - loss: 0.8576 - regression_loss: 0.7709 - classification_loss: 0.0867 226/500 [============>.................] - ETA: 1:07 - loss: 0.8555 - regression_loss: 0.7687 - classification_loss: 0.0868 227/500 [============>.................] - ETA: 1:07 - loss: 0.8573 - regression_loss: 0.7703 - classification_loss: 0.0870 228/500 [============>.................] - ETA: 1:07 - loss: 0.8579 - regression_loss: 0.7709 - classification_loss: 0.0870 229/500 [============>.................] - ETA: 1:06 - loss: 0.8562 - regression_loss: 0.7695 - classification_loss: 0.0867 230/500 [============>.................] - ETA: 1:06 - loss: 0.8560 - regression_loss: 0.7693 - classification_loss: 0.0866 231/500 [============>.................] - ETA: 1:06 - loss: 0.8590 - regression_loss: 0.7720 - classification_loss: 0.0870 232/500 [============>.................] - ETA: 1:06 - loss: 0.8609 - regression_loss: 0.7733 - classification_loss: 0.0876 233/500 [============>.................] - ETA: 1:05 - loss: 0.8606 - regression_loss: 0.7732 - classification_loss: 0.0874 234/500 [=============>................] - ETA: 1:05 - loss: 0.8597 - regression_loss: 0.7723 - classification_loss: 0.0873 235/500 [=============>................] - ETA: 1:05 - loss: 0.8585 - regression_loss: 0.7710 - classification_loss: 0.0875 236/500 [=============>................] - ETA: 1:05 - loss: 0.8585 - regression_loss: 0.7710 - classification_loss: 0.0875 237/500 [=============>................] - ETA: 1:04 - loss: 0.8590 - regression_loss: 0.7715 - classification_loss: 0.0875 238/500 [=============>................] 
- ETA: 1:04 - loss: 0.8588 - regression_loss: 0.7716 - classification_loss: 0.0872 239/500 [=============>................] - ETA: 1:04 - loss: 0.8589 - regression_loss: 0.7718 - classification_loss: 0.0871 240/500 [=============>................] - ETA: 1:04 - loss: 0.8585 - regression_loss: 0.7714 - classification_loss: 0.0871 241/500 [=============>................] - ETA: 1:03 - loss: 0.8593 - regression_loss: 0.7719 - classification_loss: 0.0874 242/500 [=============>................] - ETA: 1:03 - loss: 0.8614 - regression_loss: 0.7736 - classification_loss: 0.0877 243/500 [=============>................] - ETA: 1:03 - loss: 0.8616 - regression_loss: 0.7739 - classification_loss: 0.0878 244/500 [=============>................] - ETA: 1:03 - loss: 0.8604 - regression_loss: 0.7729 - classification_loss: 0.0876 245/500 [=============>................] - ETA: 1:03 - loss: 0.8611 - regression_loss: 0.7735 - classification_loss: 0.0876 246/500 [=============>................] - ETA: 1:02 - loss: 0.8614 - regression_loss: 0.7738 - classification_loss: 0.0875 247/500 [=============>................] - ETA: 1:02 - loss: 0.8623 - regression_loss: 0.7746 - classification_loss: 0.0877 248/500 [=============>................] - ETA: 1:02 - loss: 0.8601 - regression_loss: 0.7727 - classification_loss: 0.0875 249/500 [=============>................] - ETA: 1:02 - loss: 0.8619 - regression_loss: 0.7741 - classification_loss: 0.0878 250/500 [==============>...............] - ETA: 1:01 - loss: 0.8621 - regression_loss: 0.7742 - classification_loss: 0.0879 251/500 [==============>...............] - ETA: 1:01 - loss: 0.8632 - regression_loss: 0.7752 - classification_loss: 0.0880 252/500 [==============>...............] - ETA: 1:01 - loss: 0.8658 - regression_loss: 0.7773 - classification_loss: 0.0885 253/500 [==============>...............] - ETA: 1:01 - loss: 0.8658 - regression_loss: 0.7774 - classification_loss: 0.0884 254/500 [==============>...............] 
500/500 [==============================] - 124s 249ms/step - loss: 0.8472 - regression_loss: 0.7628 - classification_loss: 0.0844
1172 instances of class plum with average precision: 0.8050
mAP: 0.8050
Epoch 00053: saving model to ./training/snapshots/resnet50_pascal_53.h5
Epoch 54/150
- ETA: 1:41 - loss: 0.8500 - regression_loss: 0.7672 - classification_loss: 0.0828 90/500 [====>.........................] - ETA: 1:41 - loss: 0.8495 - regression_loss: 0.7668 - classification_loss: 0.0827 91/500 [====>.........................] - ETA: 1:41 - loss: 0.8433 - regression_loss: 0.7614 - classification_loss: 0.0819 92/500 [====>.........................] - ETA: 1:40 - loss: 0.8442 - regression_loss: 0.7621 - classification_loss: 0.0821 93/500 [====>.........................] - ETA: 1:40 - loss: 0.8425 - regression_loss: 0.7608 - classification_loss: 0.0817 94/500 [====>.........................] - ETA: 1:40 - loss: 0.8487 - regression_loss: 0.7661 - classification_loss: 0.0827 95/500 [====>.........................] - ETA: 1:40 - loss: 0.8444 - regression_loss: 0.7620 - classification_loss: 0.0824 96/500 [====>.........................] - ETA: 1:39 - loss: 0.8413 - regression_loss: 0.7595 - classification_loss: 0.0818 97/500 [====>.........................] - ETA: 1:39 - loss: 0.8408 - regression_loss: 0.7592 - classification_loss: 0.0816 98/500 [====>.........................] - ETA: 1:39 - loss: 0.8415 - regression_loss: 0.7599 - classification_loss: 0.0816 99/500 [====>.........................] - ETA: 1:38 - loss: 0.8408 - regression_loss: 0.7594 - classification_loss: 0.0814 100/500 [=====>........................] - ETA: 1:38 - loss: 0.8388 - regression_loss: 0.7576 - classification_loss: 0.0812 101/500 [=====>........................] - ETA: 1:38 - loss: 0.8345 - regression_loss: 0.7538 - classification_loss: 0.0807 102/500 [=====>........................] - ETA: 1:38 - loss: 0.8314 - regression_loss: 0.7511 - classification_loss: 0.0802 103/500 [=====>........................] - ETA: 1:37 - loss: 0.8343 - regression_loss: 0.7536 - classification_loss: 0.0806 104/500 [=====>........................] - ETA: 1:37 - loss: 0.8407 - regression_loss: 0.7593 - classification_loss: 0.0815 105/500 [=====>........................] 
- ETA: 1:37 - loss: 0.8370 - regression_loss: 0.7559 - classification_loss: 0.0811 106/500 [=====>........................] - ETA: 1:37 - loss: 0.8350 - regression_loss: 0.7545 - classification_loss: 0.0806 107/500 [=====>........................] - ETA: 1:36 - loss: 0.8321 - regression_loss: 0.7521 - classification_loss: 0.0800 108/500 [=====>........................] - ETA: 1:36 - loss: 0.8290 - regression_loss: 0.7496 - classification_loss: 0.0795 109/500 [=====>........................] - ETA: 1:36 - loss: 0.8318 - regression_loss: 0.7519 - classification_loss: 0.0799 110/500 [=====>........................] - ETA: 1:36 - loss: 0.8334 - regression_loss: 0.7534 - classification_loss: 0.0800 111/500 [=====>........................] - ETA: 1:35 - loss: 0.8294 - regression_loss: 0.7500 - classification_loss: 0.0794 112/500 [=====>........................] - ETA: 1:35 - loss: 0.8354 - regression_loss: 0.7551 - classification_loss: 0.0803 113/500 [=====>........................] - ETA: 1:35 - loss: 0.8381 - regression_loss: 0.7574 - classification_loss: 0.0807 114/500 [=====>........................] - ETA: 1:35 - loss: 0.8353 - regression_loss: 0.7551 - classification_loss: 0.0801 115/500 [=====>........................] - ETA: 1:34 - loss: 0.8378 - regression_loss: 0.7570 - classification_loss: 0.0808 116/500 [=====>........................] - ETA: 1:34 - loss: 0.8320 - regression_loss: 0.7519 - classification_loss: 0.0801 117/500 [======>.......................] - ETA: 1:34 - loss: 0.8346 - regression_loss: 0.7538 - classification_loss: 0.0807 118/500 [======>.......................] - ETA: 1:34 - loss: 0.8396 - regression_loss: 0.7579 - classification_loss: 0.0817 119/500 [======>.......................] - ETA: 1:33 - loss: 0.8364 - regression_loss: 0.7552 - classification_loss: 0.0812 120/500 [======>.......................] - ETA: 1:33 - loss: 0.8357 - regression_loss: 0.7549 - classification_loss: 0.0809 121/500 [======>.......................] 
- ETA: 1:33 - loss: 0.8360 - regression_loss: 0.7554 - classification_loss: 0.0806 122/500 [======>.......................] - ETA: 1:33 - loss: 0.8350 - regression_loss: 0.7548 - classification_loss: 0.0801 123/500 [======>.......................] - ETA: 1:32 - loss: 0.8406 - regression_loss: 0.7594 - classification_loss: 0.0812 124/500 [======>.......................] - ETA: 1:32 - loss: 0.8428 - regression_loss: 0.7615 - classification_loss: 0.0814 125/500 [======>.......................] - ETA: 1:32 - loss: 0.8434 - regression_loss: 0.7617 - classification_loss: 0.0817 126/500 [======>.......................] - ETA: 1:32 - loss: 0.8450 - regression_loss: 0.7635 - classification_loss: 0.0814 127/500 [======>.......................] - ETA: 1:31 - loss: 0.8453 - regression_loss: 0.7641 - classification_loss: 0.0813 128/500 [======>.......................] - ETA: 1:31 - loss: 0.8459 - regression_loss: 0.7647 - classification_loss: 0.0812 129/500 [======>.......................] - ETA: 1:31 - loss: 0.8414 - regression_loss: 0.7608 - classification_loss: 0.0807 130/500 [======>.......................] - ETA: 1:30 - loss: 0.8443 - regression_loss: 0.7631 - classification_loss: 0.0811 131/500 [======>.......................] - ETA: 1:30 - loss: 0.8424 - regression_loss: 0.7615 - classification_loss: 0.0808 132/500 [======>.......................] - ETA: 1:30 - loss: 0.8410 - regression_loss: 0.7604 - classification_loss: 0.0806 133/500 [======>.......................] - ETA: 1:30 - loss: 0.8421 - regression_loss: 0.7617 - classification_loss: 0.0804 134/500 [=======>......................] - ETA: 1:29 - loss: 0.8453 - regression_loss: 0.7642 - classification_loss: 0.0811 135/500 [=======>......................] - ETA: 1:29 - loss: 0.8476 - regression_loss: 0.7666 - classification_loss: 0.0810 136/500 [=======>......................] - ETA: 1:29 - loss: 0.8446 - regression_loss: 0.7640 - classification_loss: 0.0807 137/500 [=======>......................] 
- ETA: 1:29 - loss: 0.8464 - regression_loss: 0.7655 - classification_loss: 0.0810 138/500 [=======>......................] - ETA: 1:28 - loss: 0.8476 - regression_loss: 0.7667 - classification_loss: 0.0809 139/500 [=======>......................] - ETA: 1:28 - loss: 0.8468 - regression_loss: 0.7661 - classification_loss: 0.0806 140/500 [=======>......................] - ETA: 1:28 - loss: 0.8479 - regression_loss: 0.7671 - classification_loss: 0.0808 141/500 [=======>......................] - ETA: 1:28 - loss: 0.8448 - regression_loss: 0.7645 - classification_loss: 0.0803 142/500 [=======>......................] - ETA: 1:28 - loss: 0.8414 - regression_loss: 0.7615 - classification_loss: 0.0799 143/500 [=======>......................] - ETA: 1:27 - loss: 0.8395 - regression_loss: 0.7599 - classification_loss: 0.0796 144/500 [=======>......................] - ETA: 1:27 - loss: 0.8388 - regression_loss: 0.7592 - classification_loss: 0.0796 145/500 [=======>......................] - ETA: 1:27 - loss: 0.8438 - regression_loss: 0.7634 - classification_loss: 0.0804 146/500 [=======>......................] - ETA: 1:27 - loss: 0.8442 - regression_loss: 0.7637 - classification_loss: 0.0805 147/500 [=======>......................] - ETA: 1:26 - loss: 0.8464 - regression_loss: 0.7656 - classification_loss: 0.0809 148/500 [=======>......................] - ETA: 1:26 - loss: 0.8495 - regression_loss: 0.7683 - classification_loss: 0.0812 149/500 [=======>......................] - ETA: 1:26 - loss: 0.8485 - regression_loss: 0.7673 - classification_loss: 0.0812 150/500 [========>.....................] - ETA: 1:26 - loss: 0.8504 - regression_loss: 0.7689 - classification_loss: 0.0815 151/500 [========>.....................] - ETA: 1:26 - loss: 0.8532 - regression_loss: 0.7710 - classification_loss: 0.0822 152/500 [========>.....................] - ETA: 1:25 - loss: 0.8525 - regression_loss: 0.7705 - classification_loss: 0.0820 153/500 [========>.....................] 
- ETA: 1:25 - loss: 0.8535 - regression_loss: 0.7714 - classification_loss: 0.0821 154/500 [========>.....................] - ETA: 1:25 - loss: 0.8541 - regression_loss: 0.7719 - classification_loss: 0.0822 155/500 [========>.....................] - ETA: 1:25 - loss: 0.8524 - regression_loss: 0.7706 - classification_loss: 0.0817 156/500 [========>.....................] - ETA: 1:24 - loss: 0.8531 - regression_loss: 0.7711 - classification_loss: 0.0820 157/500 [========>.....................] - ETA: 1:24 - loss: 0.8544 - regression_loss: 0.7722 - classification_loss: 0.0822 158/500 [========>.....................] - ETA: 1:24 - loss: 0.8522 - regression_loss: 0.7703 - classification_loss: 0.0820 159/500 [========>.....................] - ETA: 1:24 - loss: 0.8535 - regression_loss: 0.7715 - classification_loss: 0.0820 160/500 [========>.....................] - ETA: 1:23 - loss: 0.8549 - regression_loss: 0.7726 - classification_loss: 0.0823 161/500 [========>.....................] - ETA: 1:23 - loss: 0.8561 - regression_loss: 0.7735 - classification_loss: 0.0825 162/500 [========>.....................] - ETA: 1:23 - loss: 0.8568 - regression_loss: 0.7740 - classification_loss: 0.0828 163/500 [========>.....................] - ETA: 1:23 - loss: 0.8579 - regression_loss: 0.7753 - classification_loss: 0.0826 164/500 [========>.....................] - ETA: 1:22 - loss: 0.8609 - regression_loss: 0.7778 - classification_loss: 0.0831 165/500 [========>.....................] - ETA: 1:22 - loss: 0.8609 - regression_loss: 0.7782 - classification_loss: 0.0827 166/500 [========>.....................] - ETA: 1:22 - loss: 0.8614 - regression_loss: 0.7786 - classification_loss: 0.0828 167/500 [=========>....................] - ETA: 1:22 - loss: 0.8627 - regression_loss: 0.7796 - classification_loss: 0.0831 168/500 [=========>....................] - ETA: 1:22 - loss: 0.8638 - regression_loss: 0.7807 - classification_loss: 0.0832 169/500 [=========>....................] 
- ETA: 1:21 - loss: 0.8638 - regression_loss: 0.7807 - classification_loss: 0.0831 170/500 [=========>....................] - ETA: 1:21 - loss: 0.8644 - regression_loss: 0.7817 - classification_loss: 0.0828 171/500 [=========>....................] - ETA: 1:21 - loss: 0.8632 - regression_loss: 0.7806 - classification_loss: 0.0826 172/500 [=========>....................] - ETA: 1:21 - loss: 0.8629 - regression_loss: 0.7804 - classification_loss: 0.0825 173/500 [=========>....................] - ETA: 1:20 - loss: 0.8651 - regression_loss: 0.7823 - classification_loss: 0.0828 174/500 [=========>....................] - ETA: 1:20 - loss: 0.8616 - regression_loss: 0.7792 - classification_loss: 0.0824 175/500 [=========>....................] - ETA: 1:20 - loss: 0.8608 - regression_loss: 0.7785 - classification_loss: 0.0822 176/500 [=========>....................] - ETA: 1:20 - loss: 0.8620 - regression_loss: 0.7797 - classification_loss: 0.0823 177/500 [=========>....................] - ETA: 1:19 - loss: 0.8614 - regression_loss: 0.7794 - classification_loss: 0.0820 178/500 [=========>....................] - ETA: 1:19 - loss: 0.8613 - regression_loss: 0.7794 - classification_loss: 0.0819 179/500 [=========>....................] - ETA: 1:19 - loss: 0.8585 - regression_loss: 0.7769 - classification_loss: 0.0816 180/500 [=========>....................] - ETA: 1:19 - loss: 0.8581 - regression_loss: 0.7768 - classification_loss: 0.0813 181/500 [=========>....................] - ETA: 1:18 - loss: 0.8573 - regression_loss: 0.7762 - classification_loss: 0.0810 182/500 [=========>....................] - ETA: 1:18 - loss: 0.8557 - regression_loss: 0.7749 - classification_loss: 0.0809 183/500 [=========>....................] - ETA: 1:18 - loss: 0.8574 - regression_loss: 0.7762 - classification_loss: 0.0812 184/500 [==========>...................] - ETA: 1:18 - loss: 0.8545 - regression_loss: 0.7737 - classification_loss: 0.0808 185/500 [==========>...................] 
- ETA: 1:18 - loss: 0.8552 - regression_loss: 0.7742 - classification_loss: 0.0811 186/500 [==========>...................] - ETA: 1:17 - loss: 0.8558 - regression_loss: 0.7748 - classification_loss: 0.0810 187/500 [==========>...................] - ETA: 1:17 - loss: 0.8537 - regression_loss: 0.7731 - classification_loss: 0.0806 188/500 [==========>...................] - ETA: 1:17 - loss: 0.8511 - regression_loss: 0.7708 - classification_loss: 0.0803 189/500 [==========>...................] - ETA: 1:17 - loss: 0.8523 - regression_loss: 0.7718 - classification_loss: 0.0805 190/500 [==========>...................] - ETA: 1:16 - loss: 0.8533 - regression_loss: 0.7724 - classification_loss: 0.0809 191/500 [==========>...................] - ETA: 1:16 - loss: 0.8526 - regression_loss: 0.7719 - classification_loss: 0.0808 192/500 [==========>...................] - ETA: 1:16 - loss: 0.8511 - regression_loss: 0.7705 - classification_loss: 0.0806 193/500 [==========>...................] - ETA: 1:16 - loss: 0.8478 - regression_loss: 0.7676 - classification_loss: 0.0802 194/500 [==========>...................] - ETA: 1:15 - loss: 0.8485 - regression_loss: 0.7683 - classification_loss: 0.0802 195/500 [==========>...................] - ETA: 1:15 - loss: 0.8466 - regression_loss: 0.7665 - classification_loss: 0.0801 196/500 [==========>...................] - ETA: 1:15 - loss: 0.8470 - regression_loss: 0.7668 - classification_loss: 0.0801 197/500 [==========>...................] - ETA: 1:15 - loss: 0.8446 - regression_loss: 0.7648 - classification_loss: 0.0798 198/500 [==========>...................] - ETA: 1:14 - loss: 0.8426 - regression_loss: 0.7632 - classification_loss: 0.0795 199/500 [==========>...................] - ETA: 1:14 - loss: 0.8427 - regression_loss: 0.7632 - classification_loss: 0.0796 200/500 [===========>..................] - ETA: 1:14 - loss: 0.8417 - regression_loss: 0.7623 - classification_loss: 0.0795 201/500 [===========>..................] 
- ETA: 1:14 - loss: 0.8424 - regression_loss: 0.7632 - classification_loss: 0.0793 202/500 [===========>..................] - ETA: 1:14 - loss: 0.8421 - regression_loss: 0.7629 - classification_loss: 0.0792 203/500 [===========>..................] - ETA: 1:13 - loss: 0.8431 - regression_loss: 0.7638 - classification_loss: 0.0793 204/500 [===========>..................] - ETA: 1:13 - loss: 0.8425 - regression_loss: 0.7633 - classification_loss: 0.0792 205/500 [===========>..................] - ETA: 1:13 - loss: 0.8431 - regression_loss: 0.7638 - classification_loss: 0.0793 206/500 [===========>..................] - ETA: 1:13 - loss: 0.8414 - regression_loss: 0.7625 - classification_loss: 0.0790 207/500 [===========>..................] - ETA: 1:12 - loss: 0.8411 - regression_loss: 0.7623 - classification_loss: 0.0789 208/500 [===========>..................] - ETA: 1:12 - loss: 0.8436 - regression_loss: 0.7643 - classification_loss: 0.0793 209/500 [===========>..................] - ETA: 1:12 - loss: 0.8434 - regression_loss: 0.7642 - classification_loss: 0.0792 210/500 [===========>..................] - ETA: 1:12 - loss: 0.8456 - regression_loss: 0.7662 - classification_loss: 0.0794 211/500 [===========>..................] - ETA: 1:11 - loss: 0.8463 - regression_loss: 0.7666 - classification_loss: 0.0797 212/500 [===========>..................] - ETA: 1:11 - loss: 0.8469 - regression_loss: 0.7673 - classification_loss: 0.0796 213/500 [===========>..................] - ETA: 1:11 - loss: 0.8473 - regression_loss: 0.7676 - classification_loss: 0.0796 214/500 [===========>..................] - ETA: 1:11 - loss: 0.8483 - regression_loss: 0.7685 - classification_loss: 0.0798 215/500 [===========>..................] - ETA: 1:10 - loss: 0.8464 - regression_loss: 0.7669 - classification_loss: 0.0796 216/500 [===========>..................] - ETA: 1:10 - loss: 0.8467 - regression_loss: 0.7671 - classification_loss: 0.0795 217/500 [============>.................] 
- ETA: 1:10 - loss: 0.8469 - regression_loss: 0.7671 - classification_loss: 0.0799 218/500 [============>.................] - ETA: 1:10 - loss: 0.8466 - regression_loss: 0.7666 - classification_loss: 0.0799 219/500 [============>.................] - ETA: 1:09 - loss: 0.8467 - regression_loss: 0.7670 - classification_loss: 0.0797 220/500 [============>.................] - ETA: 1:09 - loss: 0.8469 - regression_loss: 0.7671 - classification_loss: 0.0798 221/500 [============>.................] - ETA: 1:09 - loss: 0.8488 - regression_loss: 0.7688 - classification_loss: 0.0800 222/500 [============>.................] - ETA: 1:09 - loss: 0.8487 - regression_loss: 0.7686 - classification_loss: 0.0801 223/500 [============>.................] - ETA: 1:08 - loss: 0.8493 - regression_loss: 0.7691 - classification_loss: 0.0803 224/500 [============>.................] - ETA: 1:08 - loss: 0.8472 - regression_loss: 0.7671 - classification_loss: 0.0800 225/500 [============>.................] - ETA: 1:08 - loss: 0.8486 - regression_loss: 0.7684 - classification_loss: 0.0803 226/500 [============>.................] - ETA: 1:08 - loss: 0.8500 - regression_loss: 0.7694 - classification_loss: 0.0806 227/500 [============>.................] - ETA: 1:07 - loss: 0.8488 - regression_loss: 0.7684 - classification_loss: 0.0804 228/500 [============>.................] - ETA: 1:07 - loss: 0.8491 - regression_loss: 0.7676 - classification_loss: 0.0815 229/500 [============>.................] - ETA: 1:07 - loss: 0.8482 - regression_loss: 0.7668 - classification_loss: 0.0814 230/500 [============>.................] - ETA: 1:07 - loss: 0.8490 - regression_loss: 0.7675 - classification_loss: 0.0815 231/500 [============>.................] - ETA: 1:06 - loss: 0.8479 - regression_loss: 0.7666 - classification_loss: 0.0813 232/500 [============>.................] - ETA: 1:06 - loss: 0.8470 - regression_loss: 0.7659 - classification_loss: 0.0812 233/500 [============>.................] 
- ETA: 1:06 - loss: 0.8454 - regression_loss: 0.7645 - classification_loss: 0.0809 234/500 [=============>................] - ETA: 1:06 - loss: 0.8469 - regression_loss: 0.7657 - classification_loss: 0.0812 235/500 [=============>................] - ETA: 1:05 - loss: 0.8477 - regression_loss: 0.7665 - classification_loss: 0.0813 236/500 [=============>................] - ETA: 1:05 - loss: 0.8483 - regression_loss: 0.7670 - classification_loss: 0.0813 237/500 [=============>................] - ETA: 1:05 - loss: 0.8460 - regression_loss: 0.7650 - classification_loss: 0.0810 238/500 [=============>................] - ETA: 1:05 - loss: 0.8462 - regression_loss: 0.7651 - classification_loss: 0.0811 239/500 [=============>................] - ETA: 1:04 - loss: 0.8468 - regression_loss: 0.7656 - classification_loss: 0.0812 240/500 [=============>................] - ETA: 1:04 - loss: 0.8461 - regression_loss: 0.7651 - classification_loss: 0.0811 241/500 [=============>................] - ETA: 1:04 - loss: 0.8463 - regression_loss: 0.7652 - classification_loss: 0.0812 242/500 [=============>................] - ETA: 1:04 - loss: 0.8471 - regression_loss: 0.7659 - classification_loss: 0.0812 243/500 [=============>................] - ETA: 1:03 - loss: 0.8463 - regression_loss: 0.7652 - classification_loss: 0.0811 244/500 [=============>................] - ETA: 1:03 - loss: 0.8478 - regression_loss: 0.7667 - classification_loss: 0.0811 245/500 [=============>................] - ETA: 1:03 - loss: 0.8500 - regression_loss: 0.7687 - classification_loss: 0.0813 246/500 [=============>................] - ETA: 1:03 - loss: 0.8506 - regression_loss: 0.7692 - classification_loss: 0.0814 247/500 [=============>................] - ETA: 1:03 - loss: 0.8504 - regression_loss: 0.7690 - classification_loss: 0.0813 248/500 [=============>................] - ETA: 1:02 - loss: 0.8504 - regression_loss: 0.7691 - classification_loss: 0.0813 249/500 [=============>................] 
- ETA: 1:02 - loss: 0.8485 - regression_loss: 0.7673 - classification_loss: 0.0811 250/500 [==============>...............] - ETA: 1:02 - loss: 0.8507 - regression_loss: 0.7692 - classification_loss: 0.0815 251/500 [==============>...............] - ETA: 1:01 - loss: 0.8532 - regression_loss: 0.7716 - classification_loss: 0.0817 252/500 [==============>...............] - ETA: 1:01 - loss: 0.8534 - regression_loss: 0.7715 - classification_loss: 0.0818 253/500 [==============>...............] - ETA: 1:01 - loss: 0.8551 - regression_loss: 0.7730 - classification_loss: 0.0822 254/500 [==============>...............] - ETA: 1:01 - loss: 0.8551 - regression_loss: 0.7729 - classification_loss: 0.0821 255/500 [==============>...............] - ETA: 1:00 - loss: 0.8548 - regression_loss: 0.7726 - classification_loss: 0.0822 256/500 [==============>...............] - ETA: 1:00 - loss: 0.8565 - regression_loss: 0.7740 - classification_loss: 0.0825 257/500 [==============>...............] - ETA: 1:00 - loss: 0.8587 - regression_loss: 0.7760 - classification_loss: 0.0827 258/500 [==============>...............] - ETA: 1:00 - loss: 0.8566 - regression_loss: 0.7740 - classification_loss: 0.0826 259/500 [==============>...............] - ETA: 59s - loss: 0.8550 - regression_loss: 0.7725 - classification_loss: 0.0825  260/500 [==============>...............] - ETA: 59s - loss: 0.8564 - regression_loss: 0.7736 - classification_loss: 0.0828 261/500 [==============>...............] - ETA: 59s - loss: 0.8574 - regression_loss: 0.7745 - classification_loss: 0.0829 262/500 [==============>...............] - ETA: 59s - loss: 0.8565 - regression_loss: 0.7738 - classification_loss: 0.0827 263/500 [==============>...............] - ETA: 58s - loss: 0.8562 - regression_loss: 0.7737 - classification_loss: 0.0825 264/500 [==============>...............] - ETA: 58s - loss: 0.8574 - regression_loss: 0.7747 - classification_loss: 0.0827 265/500 [==============>...............] 
- ETA: 58s - loss: 0.8566 - regression_loss: 0.7741 - classification_loss: 0.0826 266/500 [==============>...............] - ETA: 58s - loss: 0.8553 - regression_loss: 0.7729 - classification_loss: 0.0824 267/500 [===============>..............] - ETA: 57s - loss: 0.8552 - regression_loss: 0.7728 - classification_loss: 0.0824 268/500 [===============>..............] - ETA: 57s - loss: 0.8528 - regression_loss: 0.7707 - classification_loss: 0.0821 269/500 [===============>..............] - ETA: 57s - loss: 0.8516 - regression_loss: 0.7697 - classification_loss: 0.0819 270/500 [===============>..............] - ETA: 57s - loss: 0.8521 - regression_loss: 0.7701 - classification_loss: 0.0820 271/500 [===============>..............] - ETA: 56s - loss: 0.8505 - regression_loss: 0.7687 - classification_loss: 0.0818 272/500 [===============>..............] - ETA: 56s - loss: 0.8507 - regression_loss: 0.7690 - classification_loss: 0.0817 273/500 [===============>..............] - ETA: 56s - loss: 0.8501 - regression_loss: 0.7682 - classification_loss: 0.0819 274/500 [===============>..............] - ETA: 56s - loss: 0.8510 - regression_loss: 0.7691 - classification_loss: 0.0819 275/500 [===============>..............] - ETA: 55s - loss: 0.8509 - regression_loss: 0.7690 - classification_loss: 0.0819 276/500 [===============>..............] - ETA: 55s - loss: 0.8507 - regression_loss: 0.7689 - classification_loss: 0.0817 277/500 [===============>..............] - ETA: 55s - loss: 0.8501 - regression_loss: 0.7685 - classification_loss: 0.0816 278/500 [===============>..............] - ETA: 55s - loss: 0.8519 - regression_loss: 0.7699 - classification_loss: 0.0820 279/500 [===============>..............] - ETA: 54s - loss: 0.8502 - regression_loss: 0.7684 - classification_loss: 0.0818 280/500 [===============>..............] - ETA: 54s - loss: 0.8501 - regression_loss: 0.7684 - classification_loss: 0.0818 281/500 [===============>..............] 
- ETA: 54s - loss: 0.8492 - regression_loss: 0.7676 - classification_loss: 0.0816 282/500 [===============>..............] - ETA: 54s - loss: 0.8469 - regression_loss: 0.7655 - classification_loss: 0.0813 283/500 [===============>..............] - ETA: 53s - loss: 0.8450 - regression_loss: 0.7639 - classification_loss: 0.0811 284/500 [================>.............] - ETA: 53s - loss: 0.8447 - regression_loss: 0.7637 - classification_loss: 0.0810 285/500 [================>.............] - ETA: 53s - loss: 0.8434 - regression_loss: 0.7625 - classification_loss: 0.0808 286/500 [================>.............] - ETA: 53s - loss: 0.8422 - regression_loss: 0.7616 - classification_loss: 0.0806 287/500 [================>.............] - ETA: 52s - loss: 0.8442 - regression_loss: 0.7632 - classification_loss: 0.0809 288/500 [================>.............] - ETA: 52s - loss: 0.8435 - regression_loss: 0.7627 - classification_loss: 0.0808 289/500 [================>.............] - ETA: 52s - loss: 0.8440 - regression_loss: 0.7632 - classification_loss: 0.0808 290/500 [================>.............] - ETA: 52s - loss: 0.8441 - regression_loss: 0.7633 - classification_loss: 0.0808 291/500 [================>.............] - ETA: 52s - loss: 0.8448 - regression_loss: 0.7638 - classification_loss: 0.0810 292/500 [================>.............] - ETA: 51s - loss: 0.8449 - regression_loss: 0.7639 - classification_loss: 0.0810 293/500 [================>.............] - ETA: 51s - loss: 0.8454 - regression_loss: 0.7643 - classification_loss: 0.0811 294/500 [================>.............] - ETA: 51s - loss: 0.8438 - regression_loss: 0.7629 - classification_loss: 0.0809 295/500 [================>.............] - ETA: 51s - loss: 0.8431 - regression_loss: 0.7624 - classification_loss: 0.0807 296/500 [================>.............] - ETA: 50s - loss: 0.8431 - regression_loss: 0.7622 - classification_loss: 0.0809 297/500 [================>.............] 
500/500 [==============================] - 124s 248ms/step - loss: 0.8565 - regression_loss: 0.7727 - classification_loss: 0.0838
1172 instances of class plum with average precision: 0.7976
mAP: 0.7976
Epoch 00054: saving model to ./training/snapshots/resnet50_pascal_54.h5
Epoch 55/150
- ETA: 1:31 - loss: 0.8467 - regression_loss: 0.7611 - classification_loss: 0.0856 133/500 [======>.......................] - ETA: 1:31 - loss: 0.8425 - regression_loss: 0.7575 - classification_loss: 0.0850 134/500 [=======>......................] - ETA: 1:31 - loss: 0.8421 - regression_loss: 0.7572 - classification_loss: 0.0850 135/500 [=======>......................] - ETA: 1:31 - loss: 0.8406 - regression_loss: 0.7559 - classification_loss: 0.0847 136/500 [=======>......................] - ETA: 1:30 - loss: 0.8390 - regression_loss: 0.7548 - classification_loss: 0.0842 137/500 [=======>......................] - ETA: 1:30 - loss: 0.8396 - regression_loss: 0.7555 - classification_loss: 0.0842 138/500 [=======>......................] - ETA: 1:30 - loss: 0.8369 - regression_loss: 0.7528 - classification_loss: 0.0840 139/500 [=======>......................] - ETA: 1:30 - loss: 0.8401 - regression_loss: 0.7559 - classification_loss: 0.0841 140/500 [=======>......................] - ETA: 1:29 - loss: 0.8446 - regression_loss: 0.7597 - classification_loss: 0.0848 141/500 [=======>......................] - ETA: 1:29 - loss: 0.8430 - regression_loss: 0.7584 - classification_loss: 0.0846 142/500 [=======>......................] - ETA: 1:29 - loss: 0.8449 - regression_loss: 0.7599 - classification_loss: 0.0850 143/500 [=======>......................] - ETA: 1:29 - loss: 0.8440 - regression_loss: 0.7591 - classification_loss: 0.0849 144/500 [=======>......................] - ETA: 1:28 - loss: 0.8427 - regression_loss: 0.7581 - classification_loss: 0.0846 145/500 [=======>......................] - ETA: 1:28 - loss: 0.8398 - regression_loss: 0.7556 - classification_loss: 0.0842 146/500 [=======>......................] - ETA: 1:28 - loss: 0.8419 - regression_loss: 0.7572 - classification_loss: 0.0847 147/500 [=======>......................] - ETA: 1:28 - loss: 0.8445 - regression_loss: 0.7596 - classification_loss: 0.0849 148/500 [=======>......................] 
- ETA: 1:27 - loss: 0.8453 - regression_loss: 0.7603 - classification_loss: 0.0849 149/500 [=======>......................] - ETA: 1:27 - loss: 0.8419 - regression_loss: 0.7573 - classification_loss: 0.0846 150/500 [========>.....................] - ETA: 1:27 - loss: 0.8440 - regression_loss: 0.7591 - classification_loss: 0.0849 151/500 [========>.....................] - ETA: 1:27 - loss: 0.8415 - regression_loss: 0.7570 - classification_loss: 0.0845 152/500 [========>.....................] - ETA: 1:26 - loss: 0.8417 - regression_loss: 0.7570 - classification_loss: 0.0846 153/500 [========>.....................] - ETA: 1:26 - loss: 0.8429 - regression_loss: 0.7580 - classification_loss: 0.0849 154/500 [========>.....................] - ETA: 1:26 - loss: 0.8414 - regression_loss: 0.7568 - classification_loss: 0.0846 155/500 [========>.....................] - ETA: 1:26 - loss: 0.8432 - regression_loss: 0.7584 - classification_loss: 0.0848 156/500 [========>.....................] - ETA: 1:25 - loss: 0.8449 - regression_loss: 0.7600 - classification_loss: 0.0849 157/500 [========>.....................] - ETA: 1:25 - loss: 0.8418 - regression_loss: 0.7573 - classification_loss: 0.0846 158/500 [========>.....................] - ETA: 1:25 - loss: 0.8415 - regression_loss: 0.7572 - classification_loss: 0.0843 159/500 [========>.....................] - ETA: 1:25 - loss: 0.8416 - regression_loss: 0.7575 - classification_loss: 0.0841 160/500 [========>.....................] - ETA: 1:24 - loss: 0.8423 - regression_loss: 0.7583 - classification_loss: 0.0840 161/500 [========>.....................] - ETA: 1:24 - loss: 0.8431 - regression_loss: 0.7593 - classification_loss: 0.0838 162/500 [========>.....................] - ETA: 1:24 - loss: 0.8460 - regression_loss: 0.7616 - classification_loss: 0.0844 163/500 [========>.....................] - ETA: 1:24 - loss: 0.8450 - regression_loss: 0.7610 - classification_loss: 0.0840 164/500 [========>.....................] 
- ETA: 1:23 - loss: 0.8436 - regression_loss: 0.7598 - classification_loss: 0.0838 165/500 [========>.....................] - ETA: 1:23 - loss: 0.8443 - regression_loss: 0.7604 - classification_loss: 0.0839 166/500 [========>.....................] - ETA: 1:23 - loss: 0.8428 - regression_loss: 0.7591 - classification_loss: 0.0836 167/500 [=========>....................] - ETA: 1:23 - loss: 0.8436 - regression_loss: 0.7599 - classification_loss: 0.0836 168/500 [=========>....................] - ETA: 1:22 - loss: 0.8452 - regression_loss: 0.7615 - classification_loss: 0.0837 169/500 [=========>....................] - ETA: 1:22 - loss: 0.8483 - regression_loss: 0.7641 - classification_loss: 0.0842 170/500 [=========>....................] - ETA: 1:22 - loss: 0.8457 - regression_loss: 0.7619 - classification_loss: 0.0838 171/500 [=========>....................] - ETA: 1:22 - loss: 0.8487 - regression_loss: 0.7645 - classification_loss: 0.0842 172/500 [=========>....................] - ETA: 1:21 - loss: 0.8483 - regression_loss: 0.7641 - classification_loss: 0.0842 173/500 [=========>....................] - ETA: 1:21 - loss: 0.8469 - regression_loss: 0.7628 - classification_loss: 0.0841 174/500 [=========>....................] - ETA: 1:21 - loss: 0.8438 - regression_loss: 0.7600 - classification_loss: 0.0838 175/500 [=========>....................] - ETA: 1:21 - loss: 0.8461 - regression_loss: 0.7621 - classification_loss: 0.0841 176/500 [=========>....................] - ETA: 1:20 - loss: 0.8485 - regression_loss: 0.7636 - classification_loss: 0.0849 177/500 [=========>....................] - ETA: 1:20 - loss: 0.8500 - regression_loss: 0.7650 - classification_loss: 0.0850 178/500 [=========>....................] - ETA: 1:20 - loss: 0.8494 - regression_loss: 0.7647 - classification_loss: 0.0847 179/500 [=========>....................] - ETA: 1:20 - loss: 0.8504 - regression_loss: 0.7656 - classification_loss: 0.0848 180/500 [=========>....................] 
- ETA: 1:19 - loss: 0.8495 - regression_loss: 0.7649 - classification_loss: 0.0846 181/500 [=========>....................] - ETA: 1:19 - loss: 0.8497 - regression_loss: 0.7653 - classification_loss: 0.0844 182/500 [=========>....................] - ETA: 1:19 - loss: 0.8515 - regression_loss: 0.7668 - classification_loss: 0.0847 183/500 [=========>....................] - ETA: 1:19 - loss: 0.8510 - regression_loss: 0.7666 - classification_loss: 0.0844 184/500 [==========>...................] - ETA: 1:18 - loss: 0.8510 - regression_loss: 0.7664 - classification_loss: 0.0846 185/500 [==========>...................] - ETA: 1:18 - loss: 0.8503 - regression_loss: 0.7657 - classification_loss: 0.0846 186/500 [==========>...................] - ETA: 1:18 - loss: 0.8476 - regression_loss: 0.7634 - classification_loss: 0.0842 187/500 [==========>...................] - ETA: 1:18 - loss: 0.8485 - regression_loss: 0.7643 - classification_loss: 0.0843 188/500 [==========>...................] - ETA: 1:17 - loss: 0.8483 - regression_loss: 0.7640 - classification_loss: 0.0843 189/500 [==========>...................] - ETA: 1:17 - loss: 0.8484 - regression_loss: 0.7642 - classification_loss: 0.0842 190/500 [==========>...................] - ETA: 1:17 - loss: 0.8475 - regression_loss: 0.7633 - classification_loss: 0.0842 191/500 [==========>...................] - ETA: 1:17 - loss: 0.8500 - regression_loss: 0.7653 - classification_loss: 0.0847 192/500 [==========>...................] - ETA: 1:16 - loss: 0.8503 - regression_loss: 0.7657 - classification_loss: 0.0846 193/500 [==========>...................] - ETA: 1:16 - loss: 0.8500 - regression_loss: 0.7655 - classification_loss: 0.0846 194/500 [==========>...................] - ETA: 1:16 - loss: 0.8572 - regression_loss: 0.7711 - classification_loss: 0.0861 195/500 [==========>...................] - ETA: 1:16 - loss: 0.8576 - regression_loss: 0.7715 - classification_loss: 0.0861 196/500 [==========>...................] 
- ETA: 1:15 - loss: 0.8574 - regression_loss: 0.7707 - classification_loss: 0.0867 197/500 [==========>...................] - ETA: 1:15 - loss: 0.8584 - regression_loss: 0.7713 - classification_loss: 0.0871 198/500 [==========>...................] - ETA: 1:15 - loss: 0.8597 - regression_loss: 0.7724 - classification_loss: 0.0873 199/500 [==========>...................] - ETA: 1:15 - loss: 0.8594 - regression_loss: 0.7722 - classification_loss: 0.0872 200/500 [===========>..................] - ETA: 1:15 - loss: 0.8616 - regression_loss: 0.7741 - classification_loss: 0.0875 201/500 [===========>..................] - ETA: 1:14 - loss: 0.8615 - regression_loss: 0.7740 - classification_loss: 0.0874 202/500 [===========>..................] - ETA: 1:14 - loss: 0.8621 - regression_loss: 0.7746 - classification_loss: 0.0875 203/500 [===========>..................] - ETA: 1:14 - loss: 0.8596 - regression_loss: 0.7725 - classification_loss: 0.0871 204/500 [===========>..................] - ETA: 1:14 - loss: 0.8571 - regression_loss: 0.7704 - classification_loss: 0.0868 205/500 [===========>..................] - ETA: 1:13 - loss: 0.8549 - regression_loss: 0.7685 - classification_loss: 0.0864 206/500 [===========>..................] - ETA: 1:13 - loss: 0.8549 - regression_loss: 0.7684 - classification_loss: 0.0865 207/500 [===========>..................] - ETA: 1:13 - loss: 0.8539 - regression_loss: 0.7675 - classification_loss: 0.0864 208/500 [===========>..................] - ETA: 1:12 - loss: 0.8540 - regression_loss: 0.7676 - classification_loss: 0.0864 209/500 [===========>..................] - ETA: 1:12 - loss: 0.8542 - regression_loss: 0.7678 - classification_loss: 0.0864 210/500 [===========>..................] - ETA: 1:12 - loss: 0.8519 - regression_loss: 0.7658 - classification_loss: 0.0860 211/500 [===========>..................] - ETA: 1:12 - loss: 0.8517 - regression_loss: 0.7657 - classification_loss: 0.0859 212/500 [===========>..................] 
- ETA: 1:12 - loss: 0.8509 - regression_loss: 0.7652 - classification_loss: 0.0858 213/500 [===========>..................] - ETA: 1:11 - loss: 0.8518 - regression_loss: 0.7661 - classification_loss: 0.0857 214/500 [===========>..................] - ETA: 1:11 - loss: 0.8502 - regression_loss: 0.7648 - classification_loss: 0.0854 215/500 [===========>..................] - ETA: 1:11 - loss: 0.8505 - regression_loss: 0.7651 - classification_loss: 0.0854 216/500 [===========>..................] - ETA: 1:11 - loss: 0.8521 - regression_loss: 0.7665 - classification_loss: 0.0856 217/500 [============>.................] - ETA: 1:10 - loss: 0.8511 - regression_loss: 0.7658 - classification_loss: 0.0853 218/500 [============>.................] - ETA: 1:10 - loss: 0.8482 - regression_loss: 0.7632 - classification_loss: 0.0850 219/500 [============>.................] - ETA: 1:10 - loss: 0.8486 - regression_loss: 0.7635 - classification_loss: 0.0851 220/500 [============>.................] - ETA: 1:10 - loss: 0.8497 - regression_loss: 0.7647 - classification_loss: 0.0851 221/500 [============>.................] - ETA: 1:09 - loss: 0.8497 - regression_loss: 0.7644 - classification_loss: 0.0853 222/500 [============>.................] - ETA: 1:09 - loss: 0.8482 - regression_loss: 0.7632 - classification_loss: 0.0850 223/500 [============>.................] - ETA: 1:09 - loss: 0.8482 - regression_loss: 0.7632 - classification_loss: 0.0851 224/500 [============>.................] - ETA: 1:09 - loss: 0.8472 - regression_loss: 0.7624 - classification_loss: 0.0848 225/500 [============>.................] - ETA: 1:08 - loss: 0.8450 - regression_loss: 0.7605 - classification_loss: 0.0845 226/500 [============>.................] - ETA: 1:08 - loss: 0.8457 - regression_loss: 0.7611 - classification_loss: 0.0846 227/500 [============>.................] - ETA: 1:08 - loss: 0.8470 - regression_loss: 0.7622 - classification_loss: 0.0848 228/500 [============>.................] 
- ETA: 1:08 - loss: 0.8479 - regression_loss: 0.7631 - classification_loss: 0.0848 229/500 [============>.................] - ETA: 1:07 - loss: 0.8480 - regression_loss: 0.7632 - classification_loss: 0.0849 230/500 [============>.................] - ETA: 1:07 - loss: 0.8493 - regression_loss: 0.7643 - classification_loss: 0.0850 231/500 [============>.................] - ETA: 1:07 - loss: 0.8518 - regression_loss: 0.7664 - classification_loss: 0.0854 232/500 [============>.................] - ETA: 1:07 - loss: 0.8527 - regression_loss: 0.7672 - classification_loss: 0.0856 233/500 [============>.................] - ETA: 1:06 - loss: 0.8528 - regression_loss: 0.7674 - classification_loss: 0.0853 234/500 [=============>................] - ETA: 1:06 - loss: 0.8496 - regression_loss: 0.7646 - classification_loss: 0.0850 235/500 [=============>................] - ETA: 1:06 - loss: 0.8483 - regression_loss: 0.7636 - classification_loss: 0.0848 236/500 [=============>................] - ETA: 1:06 - loss: 0.8485 - regression_loss: 0.7637 - classification_loss: 0.0848 237/500 [=============>................] - ETA: 1:05 - loss: 0.8509 - regression_loss: 0.7656 - classification_loss: 0.0853 238/500 [=============>................] - ETA: 1:05 - loss: 0.8492 - regression_loss: 0.7642 - classification_loss: 0.0850 239/500 [=============>................] - ETA: 1:05 - loss: 0.8494 - regression_loss: 0.7644 - classification_loss: 0.0850 240/500 [=============>................] - ETA: 1:05 - loss: 0.8510 - regression_loss: 0.7660 - classification_loss: 0.0851 241/500 [=============>................] - ETA: 1:04 - loss: 0.8517 - regression_loss: 0.7665 - classification_loss: 0.0852 242/500 [=============>................] - ETA: 1:04 - loss: 0.8508 - regression_loss: 0.7658 - classification_loss: 0.0850 243/500 [=============>................] - ETA: 1:04 - loss: 0.8531 - regression_loss: 0.7675 - classification_loss: 0.0856 244/500 [=============>................] 
- ETA: 1:04 - loss: 0.8545 - regression_loss: 0.7686 - classification_loss: 0.0859 245/500 [=============>................] - ETA: 1:03 - loss: 0.8549 - regression_loss: 0.7690 - classification_loss: 0.0859 246/500 [=============>................] - ETA: 1:03 - loss: 0.8540 - regression_loss: 0.7684 - classification_loss: 0.0856 247/500 [=============>................] - ETA: 1:03 - loss: 0.8522 - regression_loss: 0.7668 - classification_loss: 0.0854 248/500 [=============>................] - ETA: 1:03 - loss: 0.8524 - regression_loss: 0.7672 - classification_loss: 0.0852 249/500 [=============>................] - ETA: 1:02 - loss: 0.8540 - regression_loss: 0.7688 - classification_loss: 0.0853 250/500 [==============>...............] - ETA: 1:02 - loss: 0.8538 - regression_loss: 0.7687 - classification_loss: 0.0851 251/500 [==============>...............] - ETA: 1:02 - loss: 0.8556 - regression_loss: 0.7701 - classification_loss: 0.0855 252/500 [==============>...............] - ETA: 1:02 - loss: 0.8574 - regression_loss: 0.7717 - classification_loss: 0.0857 253/500 [==============>...............] - ETA: 1:01 - loss: 0.8600 - regression_loss: 0.7740 - classification_loss: 0.0860 254/500 [==============>...............] - ETA: 1:01 - loss: 0.8580 - regression_loss: 0.7722 - classification_loss: 0.0858 255/500 [==============>...............] - ETA: 1:01 - loss: 0.8594 - regression_loss: 0.7735 - classification_loss: 0.0858 256/500 [==============>...............] - ETA: 1:01 - loss: 0.8578 - regression_loss: 0.7721 - classification_loss: 0.0856 257/500 [==============>...............] - ETA: 1:00 - loss: 0.8587 - regression_loss: 0.7728 - classification_loss: 0.0858 258/500 [==============>...............] - ETA: 1:00 - loss: 0.8584 - regression_loss: 0.7725 - classification_loss: 0.0859 259/500 [==============>...............] - ETA: 1:00 - loss: 0.8578 - regression_loss: 0.7722 - classification_loss: 0.0857 260/500 [==============>...............] 
- ETA: 1:00 - loss: 0.8558 - regression_loss: 0.7703 - classification_loss: 0.0854 261/500 [==============>...............] - ETA: 59s - loss: 0.8553 - regression_loss: 0.7699 - classification_loss: 0.0854  262/500 [==============>...............] - ETA: 59s - loss: 0.8557 - regression_loss: 0.7700 - classification_loss: 0.0857 263/500 [==============>...............] - ETA: 59s - loss: 0.8572 - regression_loss: 0.7714 - classification_loss: 0.0858 264/500 [==============>...............] - ETA: 59s - loss: 0.8568 - regression_loss: 0.7708 - classification_loss: 0.0860 265/500 [==============>...............] - ETA: 58s - loss: 0.8569 - regression_loss: 0.7710 - classification_loss: 0.0859 266/500 [==============>...............] - ETA: 58s - loss: 0.8567 - regression_loss: 0.7706 - classification_loss: 0.0860 267/500 [===============>..............] - ETA: 58s - loss: 0.8572 - regression_loss: 0.7711 - classification_loss: 0.0861 268/500 [===============>..............] - ETA: 58s - loss: 0.8581 - regression_loss: 0.7719 - classification_loss: 0.0862 269/500 [===============>..............] - ETA: 57s - loss: 0.8583 - regression_loss: 0.7721 - classification_loss: 0.0862 270/500 [===============>..............] - ETA: 57s - loss: 0.8574 - regression_loss: 0.7714 - classification_loss: 0.0860 271/500 [===============>..............] - ETA: 57s - loss: 0.8577 - regression_loss: 0.7717 - classification_loss: 0.0860 272/500 [===============>..............] - ETA: 57s - loss: 0.8575 - regression_loss: 0.7717 - classification_loss: 0.0859 273/500 [===============>..............] - ETA: 56s - loss: 0.8571 - regression_loss: 0.7713 - classification_loss: 0.0858 274/500 [===============>..............] - ETA: 56s - loss: 0.8561 - regression_loss: 0.7705 - classification_loss: 0.0856 275/500 [===============>..............] - ETA: 56s - loss: 0.8580 - regression_loss: 0.7720 - classification_loss: 0.0861 276/500 [===============>..............] 
- ETA: 56s - loss: 0.8577 - regression_loss: 0.7717 - classification_loss: 0.0861 277/500 [===============>..............] - ETA: 55s - loss: 0.8573 - regression_loss: 0.7714 - classification_loss: 0.0859 278/500 [===============>..............] - ETA: 55s - loss: 0.8575 - regression_loss: 0.7716 - classification_loss: 0.0859 279/500 [===============>..............] - ETA: 55s - loss: 0.8584 - regression_loss: 0.7722 - classification_loss: 0.0861 280/500 [===============>..............] - ETA: 55s - loss: 0.8578 - regression_loss: 0.7719 - classification_loss: 0.0860 281/500 [===============>..............] - ETA: 54s - loss: 0.8570 - regression_loss: 0.7713 - classification_loss: 0.0858 282/500 [===============>..............] - ETA: 54s - loss: 0.8584 - regression_loss: 0.7724 - classification_loss: 0.0861 283/500 [===============>..............] - ETA: 54s - loss: 0.8591 - regression_loss: 0.7729 - classification_loss: 0.0862 284/500 [================>.............] - ETA: 53s - loss: 0.8607 - regression_loss: 0.7742 - classification_loss: 0.0865 285/500 [================>.............] - ETA: 53s - loss: 0.8596 - regression_loss: 0.7732 - classification_loss: 0.0864 286/500 [================>.............] - ETA: 53s - loss: 0.8603 - regression_loss: 0.7739 - classification_loss: 0.0864 287/500 [================>.............] - ETA: 53s - loss: 0.8602 - regression_loss: 0.7738 - classification_loss: 0.0864 288/500 [================>.............] - ETA: 52s - loss: 0.8609 - regression_loss: 0.7744 - classification_loss: 0.0865 289/500 [================>.............] - ETA: 52s - loss: 0.8615 - regression_loss: 0.7749 - classification_loss: 0.0866 290/500 [================>.............] - ETA: 52s - loss: 0.8627 - regression_loss: 0.7759 - classification_loss: 0.0868 291/500 [================>.............] - ETA: 52s - loss: 0.8629 - regression_loss: 0.7760 - classification_loss: 0.0869 292/500 [================>.............] 
- ETA: 51s - loss: 0.8621 - regression_loss: 0.7753 - classification_loss: 0.0868 293/500 [================>.............] - ETA: 51s - loss: 0.8612 - regression_loss: 0.7745 - classification_loss: 0.0866 294/500 [================>.............] - ETA: 51s - loss: 0.8614 - regression_loss: 0.7750 - classification_loss: 0.0865 295/500 [================>.............] - ETA: 51s - loss: 0.8615 - regression_loss: 0.7751 - classification_loss: 0.0864 296/500 [================>.............] - ETA: 50s - loss: 0.8622 - regression_loss: 0.7758 - classification_loss: 0.0864 297/500 [================>.............] - ETA: 50s - loss: 0.8628 - regression_loss: 0.7762 - classification_loss: 0.0865 298/500 [================>.............] - ETA: 50s - loss: 0.8621 - regression_loss: 0.7758 - classification_loss: 0.0863 299/500 [================>.............] - ETA: 50s - loss: 0.8636 - regression_loss: 0.7770 - classification_loss: 0.0866 300/500 [=================>............] - ETA: 49s - loss: 0.8636 - regression_loss: 0.7771 - classification_loss: 0.0866 301/500 [=================>............] - ETA: 49s - loss: 0.8619 - regression_loss: 0.7756 - classification_loss: 0.0863 302/500 [=================>............] - ETA: 49s - loss: 0.8626 - regression_loss: 0.7762 - classification_loss: 0.0865 303/500 [=================>............] - ETA: 49s - loss: 0.8628 - regression_loss: 0.7763 - classification_loss: 0.0865 304/500 [=================>............] - ETA: 48s - loss: 0.8622 - regression_loss: 0.7757 - classification_loss: 0.0864 305/500 [=================>............] - ETA: 48s - loss: 0.8602 - regression_loss: 0.7740 - classification_loss: 0.0862 306/500 [=================>............] - ETA: 48s - loss: 0.8603 - regression_loss: 0.7741 - classification_loss: 0.0862 307/500 [=================>............] - ETA: 48s - loss: 0.8607 - regression_loss: 0.7744 - classification_loss: 0.0863 308/500 [=================>............] 
- ETA: 47s - loss: 0.8612 - regression_loss: 0.7749 - classification_loss: 0.0863 309/500 [=================>............] - ETA: 47s - loss: 0.8607 - regression_loss: 0.7745 - classification_loss: 0.0862 310/500 [=================>............] - ETA: 47s - loss: 0.8610 - regression_loss: 0.7748 - classification_loss: 0.0862 311/500 [=================>............] - ETA: 47s - loss: 0.8603 - regression_loss: 0.7743 - classification_loss: 0.0860 312/500 [=================>............] - ETA: 46s - loss: 0.8604 - regression_loss: 0.7743 - classification_loss: 0.0860 313/500 [=================>............] - ETA: 46s - loss: 0.8596 - regression_loss: 0.7737 - classification_loss: 0.0859 314/500 [=================>............] - ETA: 46s - loss: 0.8594 - regression_loss: 0.7735 - classification_loss: 0.0859 315/500 [=================>............] - ETA: 46s - loss: 0.8573 - regression_loss: 0.7716 - classification_loss: 0.0857 316/500 [=================>............] - ETA: 45s - loss: 0.8564 - regression_loss: 0.7709 - classification_loss: 0.0855 317/500 [==================>...........] - ETA: 45s - loss: 0.8574 - regression_loss: 0.7717 - classification_loss: 0.0856 318/500 [==================>...........] - ETA: 45s - loss: 0.8589 - regression_loss: 0.7730 - classification_loss: 0.0859 319/500 [==================>...........] - ETA: 45s - loss: 0.8590 - regression_loss: 0.7727 - classification_loss: 0.0863 320/500 [==================>...........] - ETA: 44s - loss: 0.8583 - regression_loss: 0.7722 - classification_loss: 0.0861 321/500 [==================>...........] - ETA: 44s - loss: 0.8566 - regression_loss: 0.7707 - classification_loss: 0.0859 322/500 [==================>...........] - ETA: 44s - loss: 0.8556 - regression_loss: 0.7699 - classification_loss: 0.0858 323/500 [==================>...........] - ETA: 44s - loss: 0.8573 - regression_loss: 0.7713 - classification_loss: 0.0860 324/500 [==================>...........] 
- ETA: 43s - loss: 0.8567 - regression_loss: 0.7708 - classification_loss: 0.0859 325/500 [==================>...........] - ETA: 43s - loss: 0.8568 - regression_loss: 0.7709 - classification_loss: 0.0859 326/500 [==================>...........] - ETA: 43s - loss: 0.8586 - regression_loss: 0.7724 - classification_loss: 0.0862 327/500 [==================>...........] - ETA: 43s - loss: 0.8588 - regression_loss: 0.7726 - classification_loss: 0.0862 328/500 [==================>...........] - ETA: 42s - loss: 0.8593 - regression_loss: 0.7731 - classification_loss: 0.0861 329/500 [==================>...........] - ETA: 42s - loss: 0.8592 - regression_loss: 0.7732 - classification_loss: 0.0860 330/500 [==================>...........] - ETA: 42s - loss: 0.8594 - regression_loss: 0.7733 - classification_loss: 0.0861 331/500 [==================>...........] - ETA: 42s - loss: 0.8596 - regression_loss: 0.7735 - classification_loss: 0.0861 332/500 [==================>...........] - ETA: 41s - loss: 0.8603 - regression_loss: 0.7742 - classification_loss: 0.0861 333/500 [==================>...........] - ETA: 41s - loss: 0.8607 - regression_loss: 0.7745 - classification_loss: 0.0861 334/500 [===================>..........] - ETA: 41s - loss: 0.8606 - regression_loss: 0.7744 - classification_loss: 0.0862 335/500 [===================>..........] - ETA: 41s - loss: 0.8606 - regression_loss: 0.7745 - classification_loss: 0.0861 336/500 [===================>..........] - ETA: 40s - loss: 0.8609 - regression_loss: 0.7749 - classification_loss: 0.0860 337/500 [===================>..........] - ETA: 40s - loss: 0.8617 - regression_loss: 0.7756 - classification_loss: 0.0861 338/500 [===================>..........] - ETA: 40s - loss: 0.8610 - regression_loss: 0.7750 - classification_loss: 0.0860 339/500 [===================>..........] - ETA: 40s - loss: 0.8624 - regression_loss: 0.7761 - classification_loss: 0.0863 340/500 [===================>..........] 
[per-step progress lines for steps 341-499 trimmed; loss decreased steadily from 0.8608 to 0.8384]
500/500 [==============================] - 124s 248ms/step - loss: 0.8380 - regression_loss: 0.7554 - classification_loss: 0.0826
1172 instances of class plum with average precision: 0.7811
mAP: 0.7811
Epoch 00055: saving model to ./training/snapshots/resnet50_pascal_55.h5
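As a quick sanity check on the epoch summary above: in keras-retinanet the reported `loss` is the sum of the regression term (smooth-L1) and the classification term (focal loss), so the final step's components should add up to the total. A minimal sketch, using the values reported at step 500/500:

```python
# Values taken from the final progress line of epoch 55.
regression_loss = 0.7554      # smooth-L1 box-regression component
classification_loss = 0.0826  # focal-loss classification component
total_loss = 0.8380           # reported combined loss

# The total should equal the sum of the two components
# (up to the 4-decimal rounding in the progress bar).
assert abs((regression_loss + classification_loss) - total_loss) < 1e-6
print("loss components sum to the reported total")
```

If the components ever stop adding up to the total, that usually means extra loss terms (e.g. regularization) have been added to the model.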